Our second-generation BARS backpack

Military operations are becoming increasingly diverse in nature. To cope with new and more demanding tasks, the military has researched new tools for use during operations and during training for those operations. Numerous goals have driven this research over the past several decades, and many military requirements and capabilities have specifically motivated the development of augmented reality (AR) systems.

The overall goal of the Battlefield Augmented Reality System (BARS) was to do for the dismounted warfighter what the Super Cockpit and its successors had done for the pilot. Initial funding came from the Office of Naval Research. The challenges associated with urban environments were a particular concern: a complex 3D environment, a dynamic situation, and loss of line-of-sight contact among team members. Unambiguously referencing landmarks in the terrain and integrating unmanned systems into an operation can also be difficult for distributed users. All of these examples show how situation awareness (SA) is impaired in military operations in urban terrain (MOUT). The belief was that the equivalent of a head-up display would help solve these problems. By networking the mobile users together and with a command center, BARS could assist a dispersed team in establishing collaborative situation awareness.

This raises numerous issues in system configuration. BARS includes an information database, which can be updated by any user. Sharing information across the area of an operation is a critical component of team SA, so we designed an information distribution system to send these updates across the network. We enabled BARS to communicate with semi-automated forces (SAF) software to address the training issues discussed above. We chose commercially available hardware components so that we could easily upgrade BARS as improved hardware became available. We built UI components so that routes could be drawn on the terrain in the command center application and assigned to mobile users, or drawn by mobile users and suggested to the commander or directly to other mobile users. Typical AR system issues such as calibration were also investigated. Specific research efforts within BARS addressed the UI and human factors aspects described in the sections below.
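As a rough illustration of the kind of information distribution described above, the following minimal Python sketch (not BARS code; the class name, priority values, and send() callback are all hypothetical) queues database updates and delivers the most urgent ones to networked users first, mirroring the priority scheme discussed in the Collaboration section below.

    import heapq
    import itertools

    class EventDistributor:
        """Sketch of priority-ordered distribution of database updates: each update is
        queued with a priority and sent to networked users highest-priority first.
        The priorities and the send() stub are illustrative, not the BARS wire protocol."""

        def __init__(self):
            self._queue = []
            self._counter = itertools.count()  # tie-breaker keeps FIFO order per priority

        def post(self, priority, update):
            # heapq is a min-heap, so negate the priority to pop the most urgent first.
            heapq.heappush(self._queue, (-priority, next(self._counter), update))

        def flush(self, send):
            while self._queue:
                _, _, update = heapq.heappop(self._queue)
                send(update)

    d = EventDistributor()
    d.post(1, "label: 'Cafe' added by user 3")
    d.post(9, "enemy armor reported at grid 123 456")
    d.post(5, "route 'Exfil A' proposed by commander")
    d.flush(print)  # urgent report first, cosmetic label last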

For more information, contact 5581web_info@nrl.navy.mil.


X-Ray Vision and Depth Perception

The indoor (left) and outdoor (right) scenes for the experiment. Visible in these images are the colored referents and a virtual object (floating above the ground plane) which the user was to match in depth with the referent of the same color. These images show both types of linear perspective cues we introduced.

One potential advantage of AR for dismounted troops identified in our initial domain analysis was the ability to show where distributed troops were in an urban area of operations. Later, client interest included the ability to communicate points of interest in the environment to distributed team members (without the benefit of line-of-sight contact between team members). Both of these goals require the AR system to depict objects that are occluded from the user. This became a central focus of the BARS research program.

The metaphor of "Superman's X-ray vision" has long been applied to the capability of AR to depict a graphical object that is occluded by real objects. There are three aspects to the problem of displaying cues that correspond to occluded virtual objects. First, the alignment or registration of the graphics on the display must be accurate. This is a defining aspect of AR. Second, the ordinal depth between the real and virtual objects must be conveyed correctly to the user. Because we selected an optical see-through head-worn display (HWD) for operational reasons, we needed to replace the natural occlusion cue for depth ordering. Third, the metric distance of the virtual object must be understood accurately enough for the user to accomplish the task, which requires the provided cues to support sufficiently precise distance estimates. Further, each successive aspect depends on the previous ones.

We began our investigation with a study that identified graphical cues that helped convey the ordinal depth of graphical objects. We found that changing the drawing style, decreasing the opacity with increasing distance, and decreasing intensity with increasing distance helped users properly order graphical depictions of buildings that corresponded to occluded buildings on our campus. This task was detached from the real world, however, so our next experiment used a depth-matching task with the same graphical cues. This enabled us to measure metric distance judgments and forced our users to compare the depth of graphical objects to the real environment.
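These drawing-style cues lend themselves to a simple rendering rule. The sketch below (illustrative Python only; the parameter values and the linear falloff are assumptions, not the settings used in our experiments) maps an occluded object's distance to the opacity and intensity with which it is drawn, so that farther objects appear fainter.

    # Hypothetical sketch: map an occluded object's distance to the opacity and
    # intensity used when rendering it, so that farther objects appear fainter.
    # The linear falloff and the bounds are illustrative choices only.

    def depth_cue(distance_m, near_m=10.0, far_m=200.0,
                  min_alpha=0.25, max_alpha=0.9,
                  min_intensity=0.3, max_intensity=1.0):
        """Return (alpha, intensity) for an occluded object at distance_m."""
        # Normalize distance into [0, 1] between the near and far planes.
        t = (distance_m - near_m) / (far_m - near_m)
        t = max(0.0, min(1.0, t))
        # Farther objects get lower opacity and lower intensity.
        alpha = max_alpha - t * (max_alpha - min_alpha)
        intensity = max_intensity - t * (max_intensity - min_intensity)
        return alpha, intensity

    if __name__ == "__main__":
        for d in (10, 50, 100, 200):
            a, i = depth_cue(d)
            print(f"{d:4d} m -> alpha {a:.2f}, intensity {i:.2f}")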

The graph of unsigned error (light) versus the occlusion metaphor shows that the Tunnel metaphor led to the least amount of error, followed by the Virtual Wall and the Ground Grid. The Edge Map led to the greatest amount of error, followed closely by no occlusion representation ("Empty"). Looking at the signed error (dark) shows that the Tunnel, Virtual Wall, and Edge Map led users to perceive the occluded object as closer than it really was (negative error).

In our most recent experiment, we made one more important change in the experimental design. We used military standard map icons and applied the drawing styles discovered early in our sequence of experiments to these icons. We compared this to several other techniques for displaying occluded information that had appeared in the AR literature. The opacity and drawing style techniques were not as effective as newer techniques. A virtual tunnel built by drawing virtual holes in known occluding infrastructure led to the lowest error in interpreting the ordinal depth of a virtual squad icon amongst real buildings. The next best technique was one we devised for this study, a virtual wall metaphor with the number of edges increasing with ordinal depth. However, both of these techniques led users to perceive the icons as closer than they were intended to be. A ground grid technique which drew concentric circles on the ground plane resulted in the signed error that was closest to zero, even though users made more errors in this condition.
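For readers unfamiliar with the error measures shown in the graph above, the following small sketch (hypothetical Python with invented numbers, not study data) shows how signed and unsigned error are computed from matched versus true distances; a negative signed error means the participant placed the occluded object closer than intended.

    from statistics import mean

    def depth_errors(trials):
        """trials: list of (judged_distance_m, true_distance_m) pairs.
        Returns (mean signed error, mean unsigned error) in meters."""
        signed = [judged - true for judged, true in trials]
        unsigned = [abs(e) for e in signed]
        return mean(signed), mean(unsigned)

    # Illustrative numbers only (not data from the study): negative signed error
    # means the participant judged the occluded icon to be closer than it was.
    tunnel_trials = [(42.0, 45.0), (60.5, 63.0), (81.0, 80.0)]
    print(depth_errors(tunnel_trials))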


Basic Perception

The measured contrast sensitivity function (CSF) shows that AR displays severely reduce users' visual capabilities relative to their natural vision. The inset next to the legend shows the canonical form of the CSF we have sampled.

One of the problems encountered by users in our urban skills evaluation was extreme difficulty in seeing clearly through the video AR display we had selected to overcome the occlusion issue. This prompted an investigation into exactly how well users could see through AR displays. We began to consider several aspects of basic perception in AR displays: contrast sensitivity, color perception, and stereo perception.

Contrast sensitivity captures how the contrast required to detect a target varies with the target's size (spatial frequency). Such a contrast sensitivity function for AR has two forms: one can measure the user's ability to see the graphics presented on the AR display, or the ability to see the real environment through the AR display. Some optical see-through displays inhibited the user's ability to see the real world. Some graphical presentations were perceived at the visual acuity one would expect from the geometric resolution of the display device. Thus it is fair to ask whether poor visual quality of the display devices could be blamed for difficulties in any of the applications or evaluations we conducted.
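For reference, contrast sensitivity is typically computed from Michelson contrast, as in this brief sketch (illustrative Python; the threshold values are invented, and real CSF measurements would come from a psychophysical staircase procedure rather than a hard-coded table).

    def michelson_contrast(l_max, l_min):
        """Michelson contrast of a grating given its peak and trough luminance."""
        return (l_max - l_min) / (l_max + l_min)

    def contrast_sensitivity(threshold_contrast):
        """Sensitivity is the reciprocal of the lowest contrast a viewer can detect."""
        return 1.0 / threshold_contrast

    # Example: a grating with peaks at 120 cd/m^2 and troughs at 80 cd/m^2.
    print(f"contrast {michelson_contrast(120.0, 80.0):.2f}")

    # Illustrative thresholds (cycles per degree -> just-detectable Michelson contrast).
    thresholds = {1: 0.010, 4: 0.005, 8: 0.012, 16: 0.060}
    for freq, c in thresholds.items():
        print(f"{freq:2d} cpd: sensitivity {contrast_sensitivity(c):.0f}")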

The perceived color gamut for three AR displays shows the distortion that optical see-through displays nVisorST (left) and Glasstron (center), as well as video-overlay display ARvision (right), cause in the perception of colors, both of the graphical overlays and of real-world objects.

Color perception can also be a key display property for military applications and a particularly novel hazard for optical see-through displays. Black in a rendering buffer becomes transparent on an optical see-through display, allowing the real world to be seen, so one can easily imagine that dark colors will be perceived improperly. But even bright colors do not fully occlude the real-world background, and thus they too are subject to incorrect perception depending on the color (or mix of colors) that appears behind them in the real environment. We measured the color distortion seen in three AR displays, two optical see-through and one video. We found that all three displays distorted colors seen on white or black backgrounds, and that this occurred both with graphics presented on the displays and with real-world targets seen through the displays.
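One way to visualize such distortion is to plot each color in CIE 1931 chromaticity space. The sketch below uses the standard sRGB-to-XYZ conversion; the demo colors are nominal gamut corners rather than our measured data, so it shows only how the coordinates being compared could be computed.

    def srgb_to_xy(r, g, b):
        """Convert an sRGB color (components in [0,1]) to CIE 1931 xy chromaticity."""
        def linearize(c):
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
        rl, gl, bl = linearize(r), linearize(g), linearize(b)
        # Standard sRGB (D65) to XYZ matrix.
        x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
        y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
        z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
        s = x + y + z
        return (x / s, y / s) if s > 0 else (0.3127, 0.3290)  # D65 white point for black

    # Nominal gamut corners; perceived values would come from observer matches
    # made through each display.
    for name, rgb in {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1)}.items():
        print(name, srgb_to_xy(*rgb))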

Stereo presentation of graphical images has often been considered a requirement for AR displays. The belief is that in order for the user to perceive graphics as representing 3D objects existing in the surrounding 3D environment, the graphics must be in stereo. One limiting factor in whether a user is able to fuse the two images for the left and right eye is vertical alignment. Using nonius lines, we detected errors in alignment ranging from a few hundredths of a degree (well within the tolerance of the human visual system) to four tenths of a degree (an amount that would likely cause eye fatigue or headaches if the user were to force the images to fuse). Simple geometric corrections applied to one eye were sufficient to alleviate these errors. We were then able to measure the stereo acuity that users experience with AR displays, again finding that the differences in depth that a user could detect between real and graphical imagery were well above the thresholds in normal human vision of real objects.
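A back-of-the-envelope version of that geometric correction appears below (illustrative Python; the field of view and resolution are hypothetical display parameters, not those of a specific HWD): it converts a measured vertical misalignment in degrees into the number of pixel rows by which one eye's image should be shifted.

    import math

    def vertical_shift_pixels(misalignment_deg, vertical_fov_deg, vertical_res_px):
        """Pixels to shift one eye's image to cancel a measured vertical misalignment.
        Uses the tangent rule so it stays reasonable for wider fields of view."""
        px_per_unit = vertical_res_px / (2.0 * math.tan(math.radians(vertical_fov_deg) / 2.0))
        return px_per_unit * math.tan(math.radians(misalignment_deg))

    # Hypothetical display: 32-degree vertical field of view, 768 rows.
    # A 0.4-degree misalignment amounts to several pixel rows; a few hundredths
    # of a degree stays under one pixel.
    print(round(vertical_shift_pixels(0.4, 32.0, 768), 2))
    print(round(vertical_shift_pixels(0.03, 32.0, 768), 2))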


Information Filtering

Left: Showing all information and labels in the database can overwhelm the user and prevent a Marine from achieving SA. Right: Our information filter uses semantic keys and the concept of area of operations to limit the information shown, which enables the user in this case to discern the location of enemy tanks – spotted by another user – much more easily.

The issue of information overload, as noted above, can become a primary difficulty in MOUT. The physical environment is complex, 3D, and dynamic, with people and vehicles moving throughout. One might therefore think that more information would be of obvious assistance to the military personnel engaged in such operations. But the amount of information can become too much to process at the dynamic pace of military operations, to the point where it inhibits the personnel's ability to complete their assigned tasks. We have thus developed algorithms for restricting the information that is displayed to users. Our filtering algorithm evolved from a region-based filter to a hybrid of the spatial model of interaction, rule-based filtering, and the military concept of an area of operations.
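The sketch below gives a much-simplified flavor of this hybrid filter (illustrative Python only; the circular area of operations and the semantic whitelist stand in for the spatial model of interaction and the rule base actually used in BARS): an object is shown only if it lies within the user's area of operations and its semantic kind is relevant to the current task.

    from dataclasses import dataclass
    from math import hypot

    @dataclass
    class WorldObject:
        name: str
        kind: str          # semantic key, e.g. "enemy_armor", "route", "building_label"
        x: float           # position in a local ground-plane frame (meters)
        y: float

    def visible_objects(objects, user_pos, area_radius_m, allowed_kinds):
        """Hybrid filter sketch: keep an object only if it is inside the user's
        area of operations AND its semantic kind matches the current task rules."""
        ux, uy = user_pos
        return [o for o in objects
                if o.kind in allowed_kinds
                and hypot(o.x - ux, o.y - uy) <= area_radius_m]

    database = [
        WorldObject("T-72 spotted by Bravo", "enemy_armor", 120.0, 40.0),
        WorldObject("Cafe sign", "building_label", 15.0, 5.0),
        WorldObject("Exfil route", "route", 300.0, 900.0),
    ]
    # Rules for a movement-to-contact task: hide decorative labels, keep threats and routes.
    print(visible_objects(database, (100.0, 50.0), 500.0, {"enemy_armor", "route"}))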


Object Selection

In order to query, manipulate, or act upon objects, the user must first select them. BARS allows a user to select objects by combining gaze direction (using tracking of the head) with relative pointing within the field of view using a 2D or 3D mouse or an eye tracker. The complex nature of the selection operation makes it susceptible to error, and in BARS the occlusion relationships introduced by the "X-ray vision" paradigm complicate matters more than in many applications. To mitigate these errors, we designed a multimodal (speech and gesture) probabilistic selection algorithm. It incorporates an object hierarchy, several gaze and pointing algorithms, and speech recognition, and combines them using a weighted voting scheme.
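A toy version of the weighted voting step is sketched below (illustrative Python; the modality weights and scores are invented): each modality scores every candidate object, and the weighted sum determines the selection, so an unambiguous spoken cue can resolve an ambiguous gaze or pointing ray.

    def select_object(scores_by_modality, weights):
        """Weighted-voting sketch: each modality (gaze, pointing, speech) scores every
        candidate object in [0, 1]; the object with the highest weighted sum wins."""
        candidates = {}
        for modality, scores in scores_by_modality.items():
            w = weights.get(modality, 0.0)
            for obj, score in scores.items():
                candidates[obj] = candidates.get(obj, 0.0) + w * score
        return max(candidates, key=candidates.get), candidates

    # Illustrative values only: gaze and pointing are ambiguous between two occluded
    # buildings, but the spoken phrase "the far one" tips the vote.
    scores = {
        "gaze":     {"building_near": 0.55, "building_far": 0.45},
        "pointing": {"building_near": 0.50, "building_far": 0.50},
        "speech":   {"building_near": 0.10, "building_far": 0.90},
    }
    weights = {"gaze": 0.4, "pointing": 0.3, "speech": 0.3}
    print(select_object(scores, weights))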


Collaboration

A command-and-control (C2) application might show icons for forces and live sensor (including camera) data over a mixture of satellite imagery and virtual terrain.

BARS includes an information database, which can be updated by any user. Sharing information across the area of an operation is a critical component of team SA, so we designed an information distribution system that sends these updates across the network according to a priority scheme. We enabled BARS to communicate with semi-automated forces (SAF) software to address the training issues. We built UI components so that routes could be drawn on the terrain in the command center application and assigned to mobile users, or drawn by mobile users and suggested to the commander or directly to other mobile users. Virtual globe applications provide a platform for a command-and-control application; we found Google Earth to be suitable because of its 3D building layer and an API that enabled rapid prototyping of environments and an application. We simulated having sensors in the environment by merging live camera views onto this simple 3D terrain, computing the projection of each camera's image onto known geometry to approximate a live view of the environment.
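That projection step can be approximated by intersecting rays through the camera image corners with the ground plane, as in the sketch below (illustrative Python; the sensor pose and field of view are hypothetical, and the corner directions are approximated by half-field-of-view angular offsets rather than a full pinhole model).

    import math

    def ground_footprint(cam_pos, yaw_deg, pitch_deg, h_fov_deg, v_fov_deg):
        """Intersect rays through the four image corners with the ground plane z = 0.
        Returns the four ground points onto which a live camera frame could be draped."""
        cx, cy, cz = cam_pos
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        corners = []
        for du, dv in ((-1, -1), (1, -1), (1, 1), (-1, 1)):
            # Angular offset of this corner from the optical axis (approximation).
            az = yaw + du * math.radians(h_fov_deg) / 2.0
            el = pitch + dv * math.radians(v_fov_deg) / 2.0
            # Unit ray direction (x east, y north, z up).
            dx = math.cos(el) * math.sin(az)
            dy = math.cos(el) * math.cos(az)
            dz = math.sin(el)
            if dz >= 0:
                raise ValueError("corner ray does not hit the ground")
            t = -cz / dz  # solve cz + t*dz = 0
            corners.append((cx + t * dx, cy + t * dy))
        return corners

    # Hypothetical sensor: 12 m up, looking north and 30 degrees down, 60x45-degree FOV.
    print(ground_footprint((0.0, 0.0, 12.0), yaw_deg=0.0, pitch_deg=-30.0,
                           h_fov_deg=60.0, v_fov_deg=45.0))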


Embedded Training

Virtual forces in the embedded training system must appear to exist in the real world and be subject to gravity and other laws of nature.

MOUT training requires that trainees operate in urban structures and against other live trainees. Often the training uses simulated small-arms munitions and pits instructors against students in several scenarios. Many military bases have "towns" for training that consist of concrete block buildings with multiple levels and architectural configurations. AR and MR can enhance this training by providing synthetic opposing forces and non-combatants. Using AR for MOUT training is a difficult undertaking. Once one has cleared acceptance and logistics issues, there are many technical challenges to face. Many of these challenges are the same as those described earlier when AR is used for operations: wearable form factor, accurate tracking indoors and outdoors, and so on. One unique challenge of using AR for training is that the simulated forces need to appear on the user's display in a way that gives the illusion that they exist in the real world.
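A minimal sketch of that requirement (illustrative Python; the update rate and the flat terrain function are assumptions) is to integrate gravity each frame and clamp the synthetic character to the surveyed terrain height, so it neither floats above nor sinks below the real ground.

    def step_character(pos, vel, dt, terrain_height, gravity=9.81):
        """Advance a virtual character one frame so it falls under gravity and
        comes to rest on the real terrain instead of floating or sinking."""
        x, y, z = pos
        vx, vy, vz = vel
        vz -= gravity * dt
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        ground = terrain_height(x, y)
        if z <= ground:          # landed: clamp to the surveyed ground surface
            z, vz = ground, 0.0
        return (x, y, z), (vx, vy, vz)

    # Hypothetical flat training site 2 m above the site origin.
    pos, vel = (0.0, 0.0, 3.0), (1.0, 0.0, 0.0)
    for _ in range(120):  # two seconds at 60 Hz
        pos, vel = step_character(pos, vel, 1.0 / 60.0, lambda x, y: 2.0)
    print(pos)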

Publications

2013 Livingston, M. A., and K. R. Moser, "Effectiveness of Occluded Object Representations at Displaying Ordinal Depth Information in Augmented Reality", IEEE Virtual Reality, Orlando, Florida, 03/2013.  (323.07 KB)
2012 Livingston, M. A., J. L. Gabbard, J. E. Swan II, C. M. Sibley, and J. H. Barrow, "Basic Perception in Head-worn Augmented Reality Displays", Human Factors in Augmented Reality Environments: Springer, 2012.  (573.88 KB)
2012 Rosenblum, L. J., S. K. Feiner, S. J. Julier, J. E. Swan II, and M. A. Livingston, "The Development of Mobile Augmented Reality", Expanding the Frontiers of Visual Analytics and Visualization: Springer, pp. 431-448, 2012.  (520.86 KB)
2012 Livingston, M. A., "Issues in Human Factors Evaluations of Augmented Reality Systems", Human Factors in Augmented Reality Environments: Springer, 2012.  (573.88 KB)
2012 Livingston, M. A., A. Dey, C. Sandor, and B. H. Thomas, "Pursuit of 'X-ray Vision' for Augmented Reality", Human Factors in Augmented Reality Environments: Springer, 2012.  (583.72 KB)
2011 Ai, Z., and M. A. Livingston, "Mission Specific Embedded Training Using Mixed Reality", ASNE Human Systems Integration Symposium, Tysons Corner, Virginia, 10/2011.  (244.75 KB)
2011 Livingston, M. A., Z. Ai, K. Karsch, and G. O. Gibson, "User Interface Design for Military AR Applications", Journal of Virtual Reality, vol. 15, issue 2-3, pp. 175-184, 06/2011.  (445.71 KB)
2011 Ai, Z., and M. A. Livingston, "Mixed Reality on a Virtual Globe", Augmented Reality: Some Emerging Application Areas: InTech Publishing, 2011.  (860.08 KB)
2011 Livingston, M. A., L. J. Rosenblum, D. G. Brown, G. S. Schmidt, S. J. Julier, Y. Baillot, J. E. Swan II, Z. Ai, and P. Maassel, "Military Applications of Augmented Reality", Handbook of Augmented Reality: Springer, 2011.  (828.54 KB)
2009 Livingston, M. A., Z. Ai, and J. Decker, "A User Study Towards Understanding Stereo Perception in Head-worn Augmented Reality Displays", IEEE International Symposium on Mixed and Augmented Reality, Orlando, Florida, 10/2009.  (804.52 KB)
2009 Ai, Z., and M. A. Livingston, "Integration of Georegistered Information on a Virtual Globe", IEEE International Symposium on Mixed and Augmented Reality, Orlando, Florida, 10/2009.  (144.92 KB)
2009 Klein, E., J. E. Swan II, G. S. Schmidt, M. A. Livingston, and O. G. Staadt, "Measurement Protocols for Medium-Field Distance Perception in Large-Screen Immersive Displays", IEEE Virtual Reality, Lafayette, Louisiana, pp. 14-18, 03/2009.  (428.26 KB)
2009 Livingston, M. A., J. H. Barrow, and C. M. Sibley, "Quantification of Contrast Sensitivity and Color Perception Using Head-worn Augmented Reality Displays", IEEE Virtual Reality, Lafayette, Louisiana, 03/2009.  (530.21 KB)
2009 Livingston, M. A., Z. Ai, J. E. Swan II, and H. S. Smallman, "Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality", IEEE Virtual Reality, Lafayette, Louisiana, pp. 14-18, 03/2009.  (541.65 KB)
2008 Livingston, M. A., and Z. Ai, "The Effect of Registration Error on Tracking Distant Augmented Objects", IEEE International Symposium on Mixed and Augmented Reality, Cambridge, UK, 09/2008.  (14.23 MB)
2007 Swan II, J. E., A. Jones, E. Kolstad, M. A. Livingston, and H. S. Smallman, "Egocentric Depth Judgments in Optical, See-through Augmented Reality", IEEE Transactions on Visualization and Computer Graphics, vol. 13, issue 3, pp. 429-442, 05-06/2007.  (542.33 KB)
2006 Livingston, M. A., "Quantification of Visual Capabilities using Augmented Reality Displays", International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, 10/2006.  (1.07 MB)
2006 Livingston, M. A., D. G. Brown, S. J. Julier, and G. S. Schmidt, "Mobile Augmented Reality: Applications and Human Factors Evaluations", NATO Human Factors and Medicine Panel Workshop on Virtual Media for Military Applications, West Point, New York, pp. 13-15, 06/2006.  (2.43 MB)
2006 Livingston, M. A., S. J. Julier, and D. G. Brown, "Situation Awareness for Teams of Dismounted Warfighters and Unmanned Vehicles", Enhanced and Synthetic Vision Conference, SPIE Defense and Security Symposium, Orlando, Florida, 04/2006.  (284.02 KB)
2006 Schmidt, G. S., D. G. Brown, E. B. Tomlin, J. E. Swan II, and Y. Baillot, "Toward Disambiguating Multiple Selections for Frustum-Based Pointing", 3D User Interface Symposium, Alexandria, Virginia, 03/2006.  (1.24 MB)
2006 Brown, D. G., R. Stripling, and J. T. Coyne, "Augmented Reality for Urban Skills Training", IEEE Virtual Reality, Alexandria, Virginia, 03/2006.  (307.38 KB)
2006 Swan II, J. E., M. A. Livingston, H. S. Smallman, D. G. Brown, Y. Baillot, J. L. Gabbard, and D. Hix, "A Perceptual Matching Technique for Depth Judgments in Optical, See-through Augmented Reality", IEEE Virtual Reality, Alexandria, Virginia, 03/2006.  (322.29 KB)
2006 Livingston, M. A., A. Lederer, S. R. Ellis, S. M. White, and S. K. Feiner, "Vertical Vergence Calibration for Augmented Reality Displays", IEEE Virtual Reality, Alexandria, Virginia, 03/2006.  (3.58 MB)
2006 Julier, S. J., D. G. Brown, M. A. Livingston, and J. Thomas, "Adaptive Synthetic Vision", Enhanced and Synthetic Vision Conference, SPIE Defense and Security Symposium, Orlando, Florida, 2006.  (442.86 KB)
2005 Brown, D. G., Y. Baillot, M. P. Bailey, K. C. Pfluger, P. Maassel, J. Thomas, and S. J. Julier, "Using Augmented Reality to Enhance Fire Support Team Training", Interservice/Industry Training, Simulation, and Education Conference, Orlando, Florida, 12/2005.  (871.22 KB)
2005 MacIntyre, B., and M. A. Livingston, "Moving Mixed Reality into the Real World", IEEE Computer Graphics & Applications, vol. 25, issue 6, pp. 22-23, 11/2005.  (366.45 KB)
2005 Livingston, M. A., "Evaluating Human Factors in Augmented Reality Systems", IEEE Computer Graphics & Applications, vol. 25, issue 6, pp. 12-15, 11/2005.  (346.26 KB)
2005 Livingston, M. A., C. Zanbaka, J. E. Swan II, and H. S. Smallman, "Objective Measures for the Effectiveness of Augmented Reality", IEEE Virtual Reality (Poster Session), Bonn, Germany, 03/2005.  (154.98 KB)
2005 Livingston, M. A., D. G. Brown, J. E. Swan II, B. Goldiez, Y. Baillot, and G. S. Schmidt, "Applying a Testing Methodology to Augmented Reality Interfaces to Simulation Systems", Western Simulation Multiconference, New Orleans, LA, 01/2005.  (489.26 KB)
2005 Brown, D. G., Y. Baillot, K. C. Pfluger, S. J. Julier, and M. A. Livingston, "Virtual Targets for the Real World", NRL Review, 2005.  (359.43 KB)
2004 Brown, D. G., Y. Baillot, S. J. Julier, P. Maassel, D. Armoza, M. A. Livingston, and L. J. Rosenblum, "Building a Mobile Augmented Reality System for Embedded Training: Lessons Learned", Interservice / Industry Training, Simulation, & Education Conference (I/ITSEC '04), Orlando, Florida, 12/2004.  (794.24 KB)
2004 Goldiez, B., M. A. Livingston, J. Dawson, D. G. Brown, P. Hancock, Y. Baillot, and S. J. Julier, "Advancing Human-Centered Augmented Reality Research", Army Science Conference, Orlando, Florida, 12/2004.  (198.16 KB)
2004 Coelho, E M., S. J. Julier, and B. MacIntyre, "OSGAR: A Scene Graph with Uncertain Transformations", Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, Virginia, 11/2004.  (1.01 MB)
2004 Livingston, M. A., J. E. Swan II, S. J. Julier, Y. Baillot, D. G. Brown, L. J. Rosenblum, J. L. Gabbard, T. H. Hollerer, and D. Hix, "Evaluating System Capabilities and User Performance in the Battlefield Augmented Reality System", Performance Metrics for Intelligent Systems Workshop, Gaithersburg, Maryland, 08/2004.  (620.97 KB)
2004 Armoza, D., and D. G. Brown, "Embedded Mobile Augmented Reality Trainer Within a Distributed HLA Simulation", Simulation Interoperability Workshop, Alexandria, Virginia, 04/2004.  (290.56 KB)
2004 Brown, D. G., S. J. Julier, Y. Baillot, M. A. Livingston, and L. J. Rosenblum, "Event-Based Data Distribution for Mobile Augmented Reality and Virtual Environments", Presence: Teleoperators and Virtual Environments, vol. 13, issue 2, pp. 211-221, 04/2004.  (198.71 KB)
2004 Hix, D., J. L. Gabbard, J. E. Swan II, M. A. Livingston, T. H. Hollerer, S. J. Julier, Y. Baillot, and D. G. Brown, "A Cost-Effective Usability Evaluation Progression for Novel Interactive Systems", Hawaii International Conference on System Sciences (HICSS-37), 01/2004.  (396.87 KB)
2003 Brown, D. G., Y. Baillot, S. J. Julier, D. Armoza, J. Eliason, M. A. Livingston, L. J. Rosenblum, and P. Garrity, "Data Distribution for Mobile Augmented Reality in Simulation and Training", Interservice / Industry Training, Simulation, & Education Conference (I/ITSEC '03), Orlando, Florida, 12/2003.  (1.14 MB)
2003 Baillot, Y., S. J. Julier, D. G. Brown, and M. A. Livingston, "A Tracker Alignment Framework for Augmented Reality", International Symposium on Mixed and Augmented Reality (ISMAR '03), Tokyo, Japan, 10/2003.  (393.84 KB)
2003 Julier, S. J., M. A. Livingston, J. E. Swan II, Y. Baillot, and D. G. Brown, "Adaptive User Interfaces in Augmented Reality", Software Technology for Augmented Reality Systems, Tokyo, Japan, 10/2003.
2003 Livingston, M. A., J. E. Swan II, J. L. Gabbard, T. H. Hollerer, D. Hix, S. J. Julier, Y. Baillot, and D. G. Brown, "Resolving Multiple Occluded Layers in Augmented Reality", International Symposium on Mixed and Augmented Reality (ISMAR '03), Tokyo, Japan, 10/2003.  (142.56 KB)
2003 Gabbard, J. L., D. Hix, J. E. Swan II, M. A. Livingston, T. H. Hollerer, and S. J. Julier, "Usability Engineering for Complex Interactive Systems Development", Engineering for Usability, Human Systems Integration Symposium 2003, Vienna, Virginia, 06/2003.  (318.26 KB)
2003 Baillot, Y., J. Eliason, G. S. Schmidt, J. E. Swan II, D. G. Brown, S. J. Julier, M. A. Livingston, and L. J. Rosenblum, "Evaluation of the ShapeTape Tracker for Wearable, Mobile Interaction", IEEE Virtual Reality 2003, Alexandria, Virginia, 03/2003.  (116.63 KB)
2003 Brown, D. G., S. J. Julier, Y. Baillot, and M. A. Livingston, "An Event-Based Data Distribution Mechanism for Collaborative Mobile Augmented Reality and Virtual Environments", IEEE Virtual Reality 2003, Los Angeles, California, pp. 23-29, 03/2003.  (986.88 KB)
2002 Livingston, M. A., L. J. Rosenblum, S. J. Julier, D. G. Brown, Y. Baillot, J. E. Swan II, J. L. Gabbard, and D. Hix, "An Augmented Reality System for Military Operations in Urban Terrain", Proceedings of the Interservice / Industry Training, Simulation, & Education Conference (I/ITSEC '02), Orlando, Florida, 12/2002.  (176.21 KB)
2002 Julier, S. J., M. Lanzagorta, Y. Baillot, and D. G. Brown, "Information Filtering for Mobile Augmented Reality", Projects in VR, IEEE Computer Graphics & Applications, vol. 22, issue 5, pp. 12-15, 09-10/2002.  (946.18 KB)
2002 MacIntyre, B., E M. Coelho, and S. J. Julier, "Estimating and Adapting to Registration Errors in Augmented Reality Systems", Technical Papers, IEEE Virtual Reality 2002, Orlando, Florida, pp. 73-80, 03/2002.  (557.41 KB)
2002 Gabbard, J. L., J. E. Swan II, D. Hix, M. Lanzagorta, M. A. Livingston, D. G. Brown, and S. J. Julier, "Usability Engineering: Domain Analysis Activities for Augmented Reality Systems", The Engineering Reality of Virtual Reality 2002, SPIE vol. 4660, Stereoscopic Displays and Virtual Reality Systems IX, pp. 445-457, 01/2002.  (488.13 KB)
2001 Azuma, R., Y. Baillot, R. Behringer, S. Feiner, S. J. Julier, and B. MacIntyre, "Recent Advances in Augmented Reality", IEEE Computer Graphics & Applications, vol. 21, issue 6, pp. 34-47, 11-12/2001.  (2.23 MB)
2001 Hollerer, T. H., S. K. Feiner, D. Hallaway, B. Bell, M. Lanzagorta, D. G. Brown, S. J. Julier, Y. Baillot, and L. J. Rosenblum, "User interface management techniques for collaborative mobile augmented reality", Computers & Graphics, vol. 25, issue 5, pp. 799-810, 10/2001.  (3.15 MB)
2001 Baillot, Y., D. G. Brown, and S. J. Julier, "Authoring of Physical Models Using Mobile Computers", International Symposium on Wearable Computers, Zurich, Switzerland, 10/2001.  (368.72 KB)
2001 Rosenblum, L. J., "Geospatial Requirements for Mobile Augmented Reality Systems", National Research Council Workshop on the Intersection of Geospatial Information and Information Technology, 10/2001.
2001 Julier, S. J., D. G. Brown, and Y. Baillot, "The need for AI: Intuitive User Interfaces for Mobile Augmented Reality Systems", AIMS 2001 (2nd Workshop on AI in Mobile Systems), Seattle, Washington, 08/2001.  (1.58 MB)
2001 Julier, S. J., Y. Baillot, M. Lanzagorta, L. J. Rosenblum, and D. G. Brown, "Urban Terrain Modeling for Augmented Reality Applications", 3D Synthetic Environments Reconstruction, Dordrecht, The Netherlands, Kluwer Academic Publishers, pp. 119-136, 2001.  (1.04 MB)
2000 Julier, S. J., Y. Baillot, M. Lanzagorta, D. G. Brown, and L. J. Rosenblum, "BARS: Battlefield Augmented Reality System", NATO Symposium on Information Processing Techniques for Military Systems, Istanbul, Turkey, 10/2000.  (109.35 KB)
2000 Julier, S. J., M. Lanzagorta, Y. Baillot, L. J. Rosenblum, S. K. Feiner, T. H. Hollerer, and S. Sestito, "Information Filtering for Mobile Augmented Reality", IEEE International Symposium on Augmented Reality 2000, Munich, Germany, pp. 3-11, 10/2000.  (132.32 KB)
2000 Sestito, S., S. J. Julier, M. Lanzagorta, and L. J. Rosenblum, "Intelligent Filtering for Augmented Reality", SimTect 2000, Sydney, Australia, 02-03/2000.  (64.16 KB)