The Immersive Simulation Lab (ISL) focuses on the design and development of advanced interaction techniques: the ways people use devices to perform tasks. Our work emphasizes interaction with three-dimensional virtual environments. When a virtual environment is used to simulate real-world actions, it becomes a virtual simulation. ISL has a long history of developing user interfaces for dismounted infantry simulation. Such an interface must afford extensive, immediate control over the actions of the user's avatar: an articulated representation of the user's body. Our work has focused on simulation for training the cognitive skills involved in infantry combat, such as decision-making and team coordination. In practice, this means allowing trainees to look, move, and shoot at a level of performance similar to what they attain in the real world.
The Pointman User Interface
Pointman is ISL’s latest user interface for dismounted infantry simulation. It is a sophisticated desktop interface that makes it easy to control an avatar’s movements precisely. Pointman adds a head tracker and foot pedals to a modern gamepad controller, engaging the user’s head, hands, and feet to control corresponding segments of the avatar’s body. Unlike most infantry simulation interfaces, which trigger canned animations, it provides continuous positional control over the avatar’s posture and movement. Pointman supports head-coupled viewing and aiming, reciprocal control over stepping, and continuous control over most aspects of the avatar’s pose, including leaning in any direction, crouching, and weapon hold.
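The mapping just described can be illustrated with a minimal control-loop sketch. This is not Pointman's actual implementation; the device snapshots (`gamepad`, `pedals`, `head_tracker`), field names, and rate constants are all hypothetical, chosen only to show how each input device could drive the body segment it corresponds to, with continuous values rather than triggered animations.

```python
from dataclasses import dataclass

@dataclass
class AvatarPose:
    """Continuous pose parameters a Pointman-style interface drives each frame."""
    heading_deg: float = 0.0    # body yaw, steered by a gamepad stick
    lean_x: float = 0.0         # lateral lean, -1..1
    lean_y: float = 0.0         # fore/aft lean, -1..1
    crouch: float = 0.0         # 0 = standing, 1 = full crouch
    head_yaw_deg: float = 0.0   # head-coupled viewing/aiming direction
    head_pitch_deg: float = 0.0
    step_phase: float = 0.0     # advanced reciprocally by the foot pedals

def update_pose(pose, gamepad, pedals, head_tracker, dt):
    """One control tick: map each device to the body segment it controls.

    The inputs are hypothetical per-frame snapshots (dicts of floats);
    the real device mapping is far more elaborate.
    """
    # Gamepad sticks give continuous positional control over heading and lean.
    pose.heading_deg = (pose.heading_deg + gamepad["turn"] * 90.0 * dt) % 360.0
    pose.lean_x = gamepad["lean_x"]
    pose.lean_y = gamepad["lean_y"]
    pose.crouch = gamepad["crouch"]
    # The head tracker couples the user's actual head motion to viewing and aiming.
    pose.head_yaw_deg = head_tracker["yaw"]
    pose.head_pitch_deg = head_tracker["pitch"]
    # Reciprocal pedal presses advance the stepping cycle; press depth sets pace.
    pose.step_phase = (pose.step_phase + (pedals["left"] + pedals["right"]) * dt) % 1.0
    return pose
```

The point of the sketch is the division of labor: head, hands, and feet each supply a continuous signal, so the avatar's pose is driven positionally every frame instead of being selected from a library of prebuilt animations.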
ISL is extending the user’s control over the avatar to support non-verbal communication without limiting tactical mobility. The interface will capture and portray the user’s arm movements, direction of gaze, and facial expressions via the user’s avatar, represented in real time within a networked virtual simulation.
The Gaiter User Interface
In 1999, ISL developed a virtual locomotion control, called Gaiter, that lets a person move through the virtual environment by walking in place. The user can move in any direction and control the stride length and cadence of their avatar’s steps in the virtual world. A virtual locomotion technique that uses leg motions similar to actual stepping lets people apply their natural ability to coordinate locomotion with other body motions, like turning their head or manipulating objects with their hands.
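The core of walking-in-place locomotion can be reduced to a short sketch. A real controller such as Gaiter estimates stride length and cadence from the user's tracked leg motion; in this simplified, hypothetical version those quantities are supplied directly, and the resulting speed is applied along the stepping direction.

```python
import math

def virtual_step(heading_deg, stride_m, cadence_hz, dt):
    """Convert in-place stepping into virtual displacement for one frame.

    heading_deg: direction of stepping (the user can step in any direction)
    stride_m:    estimated stride length in metres
    cadence_hz:  estimated steps per second
    dt:          frame time in seconds
    """
    speed = stride_m * cadence_hz        # forward speed, metres per second
    heading = math.radians(heading_deg)
    dx = speed * math.sin(heading) * dt  # east-west displacement
    dy = speed * math.cos(heading) * dt  # north-south displacement
    return dx, dy
```

For example, 0.75 m strides at 2 steps per second yield 1.5 m/s of virtual travel; integrated every frame, the avatar walks through the virtual world while the user steps in place.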
Beginning in 2002, Gaiter was integrated into an experimental infantry combat simulator for the ONR Virtual Technology and Environments (VIRTE) Close Quarter Battle Training Prototype (Demo II). It was intended to be a fully immersive system that allowed people to employ their whole body to interact naturally in virtual simulation. We tracked the rotation and translation of the user’s major body segments, an instrumented rifle prop, and a head-mounted display (HMD). The user stepped in place to move through the virtual space, and the avatar continuously reflected the user’s upper-body posture. Shooting was performed by manipulating a realistic rifle prop, rendered graphically in the HMD.
Although this approach appeared promising at first, we found that using natural actions to control the avatar without providing truly realistic sensory feedback disrupts the user’s ability to perform tasks. The disruptive factors included the HMD’s limited field of view and resolution, the latency between the user’s movements and the avatar’s response, and the misalignment between where the virtual rifle appeared in the HMD and how the rifle prop was physically held. We went back to the drawing board: we reevaluated existing user interfaces for first-person shooter games and considered how we could extend them to include the advantages of body-tracked simulation. This led to the development of Pointman.