Navy Center for Applied Research in Artificial Intelligence
Audio Lab
The Audio Laboratory has facilities for rendering and analyzing complex sound for military applications. Our rendering systems include a 28-speaker array arranged in five rings within a space deadened by 16 RPG Diffuser VariScreens, letting us produce realistic audio environments from pre-recorded or synthesized sound. The rendering system typically uses vector base amplitude panning (VBAP), a three-speaker panning algorithm developed by Ville Pulkki. The lab also includes a 100 ft² double-walled booth from Industrial Acoustics Company for experiments that need high levels of acoustic isolation, as well as wireless transmitters and receivers, including a prototype battlefield acoustic sensor that uses binaural microphones arranged in a head-like, throwable device. We have a prototype in-helmet rendering system that enables a soldier to hear spatial sound without wearing an ear-occluding headset. Analysis capabilities include a B&K PULSE system and a B&K head and torso simulator. Computer equipment includes two SGI Octanes, one with 32 I/O channels, and two Crystal River Acoustetrons. Software includes the Virtual Audio Server, developed by Hesham Fouad for his Ph.D. at George Washington University, and CATT-Acoustic modeling software. Audio equipment includes two 16-channel mixers, two 32-channel graphic equalizers, three DAT recorders, eight Sennheiser microphones, and several Sennheiser and Sony headphones.
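To illustrate the panning technique named above: VBAP selects the pair (in 2-D) of loudspeakers whose arc brackets the source direction, solves a small linear system for the gains, and normalizes them for constant power. The sketch below is a minimal 2-D version assuming speakers specified as sorted azimuths in degrees; the function and variable names are illustrative, not from the lab's actual rendering software.

```python
import numpy as np

def _unit(deg):
    """Unit vector for an azimuth given in degrees."""
    r = np.radians(deg)
    return np.array([np.cos(r), np.sin(r)])

def vbap_gains_2d(source_deg, speaker_degs):
    """2-D vector base amplitude panning (after Pulkki): find the
    adjacent speaker pair whose arc brackets the source direction,
    solve p = L g for the pair's gains, and normalize for constant
    power.  speaker_degs must be sorted by azimuth."""
    p = _unit(source_deg)
    n = len(speaker_degs)
    for i in range(n):
        j = (i + 1) % n  # adjacent pair, wrapping past the rear
        L = np.column_stack([_unit(speaker_degs[i]), _unit(speaker_degs[j])])
        g = np.linalg.solve(L, p)
        if np.all(g >= -1e-9):  # non-negative gains: this pair brackets the source
            gains = np.zeros(n)
            gains[[i, j]] = np.clip(g, 0.0, None)
            return gains / np.linalg.norm(gains)
    raise ValueError("no speaker pair brackets the source direction")
```

For example, a source straight ahead of a four-speaker ring at ±30° and ±110° splits its energy equally between the two front speakers. The full 3-D case used with a multi-ring array works the same way with speaker triplets and a 3x3 system.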
Robot Learning Lab
The robot laboratory is a 1256 ft² facility that allows freedom of motion for mobile robots and can be configured with obstacles or furniture to simulate a robot's expected working environment. The facility maintains 14 commercial mobile robots from companies with a wide following in the robotics community, which enables the integration of outside research from other government, academic, and industry laboratories. The robots include seven Nomadic Technologies robots of varying design and capability for indoor applications and three iRobot all-terrain robot vehicles for mixed indoor and outdoor applications. Proprioceptive sensors on the robots include odometry, pitch/roll/yaw sensors, compasses, inertial measurement units, and tactile bumpers. Onboard range finders include sonar, active infrared, scanning laser LIDAR, structured light, and stereo vision cameras. Computing facilities include many Sun workstations, Windows PCs, Linux PCs, and Macs in both desktop and notebook models. The robot lab provides an environment for developing and evaluating intelligent software for both actual and simulated autonomous vehicles; laboratory computers support testing intelligent algorithms on simulated land, air, and sea vehicles. The mobile robots are also available as test platforms for sensors, interfaces, and other technologies being developed by groups within NRL.
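As a small illustration of the odometry listed among the proprioceptive sensors, dead reckoning for a differential-drive robot (the common drive type for indoor research platforms like these) integrates the two wheel displacements into a pose update. This is a generic textbook sketch, not code from the lab; the function name and wheel-base parameter are illustrative.

```python
import math

def odom_update(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning pose update for a differential-drive robot.
    d_left/d_right are the wheel displacements since the last update;
    wheel_base is the distance between the wheels."""
    d = (d_left + d_right) / 2.0            # distance traveled by the center
    dtheta = (d_right - d_left) / wheel_base  # change in heading
    # Midpoint integration: advance along the average heading of the step.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    theta = (theta + dtheta + math.pi) % (2.0 * math.pi) - math.pi  # wrap to (-pi, pi]
    return x, y, theta
```

In practice such odometry drifts, which is why the robots carry complementary sensors (compass, inertial measurement units) and exteroceptive range finders.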
Immersive Simulation Lab
The Immersive Simulation Lab houses specialized equipment for the design, development, and evaluation of individual immersive virtual environments (VE). It is equipped with two real-time optical full-body motion tracking systems: a ten-camera passive optical system from Vicon Motion Systems and an eight-camera active optical system from PhaseSpace, Inc. Computer equipment includes high-end workstations with state-of-the art graphics cards. The Lab has several head mounted displays including the new high resolution stereo nVisor SX model from NVIS, Inc and the lower resolution stereo V8 model from Virtual Research Systems. A plasma display lets large groups of visitors see what is happening in the system, for example showing what the immersed persons sees in the head mount. Because a person stands while immersed in our immersive Infantry Simulator while wearing a head mount, the Lab acquired a harnessing system from Dr. Roger Kaufman at The George Washington University. The harness allows the person to turn while remaining centered in the tracked space. It also manages the head mount cable. Haptic cues may be given to an immersed person through a Tacta Vest developed by Dr. Robert Lindeman at The George Washington University. A simulation may include shooting, which is simulated with an M4 replica AirSoft model rifle. The rifle is instrumented to detect and transmit trigger pulls. To add realism to a military simulation, a person wears a standard issue flak jacket worn by the Marines in combat. Audio equipment includes Sennheiser headphones and a MOTU 828 firewire audio interface. The Lab also has the capability to test real world shooting skills using a laser marksmanship system designed for indoor use by Beamhit LLC. The Lab software is designed in-house using commercial components. The graphics engine is Gamebryo from NDL. The core simulation software is ManSim from Lockheed Martin ASC. Spatialized sound is rendered with the Virtual Audio Server from VRSonic. 
Also available is JSAF to provide computer-generated adversaries.
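The optical tracking systems above report 3-D marker positions; turning a set of tracked markers into a rigid-body pose (for example, the head mount's position and orientation) is commonly done with a least-squares fit such as the Kabsch algorithm. A minimal sketch, assuming NumPy and hypothetical marker arrays; this is a generic technique, not the Vicon or PhaseSpace vendor software.

```python
import numpy as np

def rigid_pose(ref, obs):
    """Least-squares rigid transform (Kabsch algorithm) from reference
    marker positions (N x 3, in the body frame) to observed positions
    (N x 3, in the lab frame).  Returns R, t with obs_i ~ R @ ref_i + t."""
    ref_c = ref - ref.mean(axis=0)          # center both marker clouds
    obs_c = obs - obs.mean(axis=0)
    H = ref_c.T @ obs_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obs.mean(axis=0) - R @ ref.mean(axis=0)
    return R, t
```

Run per frame, the recovered pose drives the head mount's view transform and the avatar's limbs in the simulation.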