Published in Dec 1992 CyberEdge Journal
CyberEdge Journal is published by Ben Delaney, bdel@well.sf.ca.us

A SUMMARY OF VIRTUAL ENVIRONMENTS RESEARCH AT UNC-CHAPEL HILL

by Mark A. DeLoura, deloura@cs.unc.edu

The University of North Carolina at Chapel Hill's Computer Science
department has been doing research into immersive head-mounted virtual
environment systems since 1986, when its first head-mounted display
prototype was completed. Since that time, one of the major goals of the
department has been improving the realism of virtual worlds by advancing
the state of the art in both software and hardware systems.

In this article I'll briefly outline UNC's concentrations for the year, as
well as describe the current system used for developing VR-based
applications.

Fall 1992 sees the continuation of work on building PixelFlow, the newest
machine in a line of graphics multicomputers built by members of the
department. PixelFlow, detailed in a SIGGRAPH '92 paper by Fuchs and
Molnar, will combine partial images produced by multiple independent
rendering pipelines in a high-speed image composition network to produce
the final image. Performance of this machine is expected to be linearly
scalable to well over 10 million anti-aliased, textured polygons per
second, supporting advanced shading models and multiple shaped light
sources. A working prototype of the PixelFlow system is expected to be
operational by early 1994.

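The core idea behind PixelFlow's composition network can be sketched in a
few lines: each renderer produces a partial image of (depth, color)
samples, and the compositor keeps the nearest sample at every pixel, just
as in ordinary z-buffering. This is only an illustration of the
principle; the names and data layout here are invented, not PixelFlow's.

```python
def composite(partials):
    """Merge partial (z, color) images; the nearest sample wins at
    every pixel, as in ordinary z-buffering."""
    final = {}
    for partial in partials:
        for pixel, (z, color) in partial.items():
            if pixel not in final or z < final[pixel][0]:
                final[pixel] = (z, color)
    return final

# Two renderers each cover part of the screen and overlap at (0, 0);
# the overlap resolves to the nearer (smaller-z) sample.
a = {(0, 0): (5.0, "red"), (1, 0): (3.0, "green")}
b = {(0, 0): (2.0, "blue")}
merged = composite([a, b])
```

The hardware does this comparison in parallel across the network rather
than pixel by pixel in software, which is what makes the design scale
linearly with the number of rendering pipelines.
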
The current rendering machine used by most VR-based applications in the
department is Pixel-Planes 5. The Pixel-Planes 5 multicomputer was part of
the equipment brought to SIGGRAPH '91 by UNC, and was the graphics
workhorse used in all of the demos that were shown there. (For more
information on the SIGGRAPH '91 "Tomorrow's Realities" demos, see CyberEdge
Journal issue #5.) Pixel-Planes 5 is programmed in C or C++ with a subset
of PHIGS+, and can produce in excess of 2 million Phong-shaded, z-buffered
triangles per second. VR applications are most commonly built using
various libraries created by students, such as PPHIGS (graphics), trackerlib
(tracking mechanisms), adlib (analog/digital devices), and vlib (virtual
world-specific routines, such as maintenance of standard transformations).

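The split into graphics, tracking, device, and world libraries suggests a
per-frame control flow roughly like the sketch below. Every name in it is
hypothetical; none of these are the actual PPHIGS, trackerlib, adlib, or
vlib calls, which are not documented here.

```python
# Hypothetical per-frame loop; the class and method names are invented
# to illustrate how the four libraries divide the work.
class FrameLoop:
    def __init__(self, tracker, devices, world, renderer):
        self.tracker, self.devices = tracker, devices
        self.world, self.renderer = world, renderer

    def step(self):
        pose = self.tracker.read_pose()        # trackerlib's role: head pose
        inputs = self.devices.poll()           # adlib's role: buttons, joysticks
        self.world.update(pose, inputs)        # vlib's role: standard transforms
        return self.renderer.draw(self.world)  # PPHIGS's role: draw the frame

# Minimal stand-ins, just to show the order of operations:
class Stub:
    def __init__(self): self.log = []
    def read_pose(self): self.log.append("pose"); return "pose"
    def poll(self): self.log.append("poll"); return "inputs"
    def update(self, pose, inputs): self.log.append("update")
    def draw(self, world): self.log.append("draw"); return "frame"

s = Stub()
frame = FrameLoop(s, s, s, s).step()
```
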
The Tracking group has developed a working optoelectronic tracking ceiling,
made up of many 2- by 2-foot ceiling tiles with 32 infrared LEDs per tile.
The special head-mounted display used with this ceiling tracker has four
cameras attached to it which point at the ceiling; these provide enough
information for the computer to resolve the user's position to within 2 mm,
and orientation to 0.2 degrees. Update rates depend on the mode the
ceiling is in, but 50-80 Hz is typical, as is a lag of 15-30 ms. The
ceiling currently measures 10 by 12 feet, but plans are in the works to
increase its size to 15 by 30 feet. Research is underway to develop a
Self-Tracker, which can determine changes in position and orientation by
viewing the existing environment.

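The figures above pin down the scale of the hardware: a short calculation
(illustrative only, using the quoted 2-foot tiles and 32 LEDs per tile)
shows the current 10- by 12-foot ceiling holds 30 tiles, or 960 LEDs.

```python
def ceiling_leds(width_ft, depth_ft, tile_ft=2, leds_per_tile=32):
    """Tile and LED counts for a tracking ceiling built from square
    tiles, using the figures quoted in the article."""
    tiles = (width_ft // tile_ft) * (depth_ft // tile_ft)
    return tiles, tiles * leds_per_tile

tiles, leds = ceiling_leds(10, 12)  # the current ceiling
```
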
Head-mounted displays (HMDs) used by the department include a see-through
prototype, a video-merge HMD, VPL EyePhones, and the Virtual Research
Flight Helmet. For more complex user interactions, a variety of
manipulators are available for use; these include an Argonne Remote
Manipulator (ARM) force-feedback arm, a billiard ball, a Python joystick, a
modified bicycle glove, a "wand", and a pair of analog joysticks. All of
the hand-held input devices and HMDs (except for the optoelectronic tracking
ceiling) are tracked by Polhemus 6-D magnetic trackers (3SPACE and FASTRAK
models).

Work on software for improving the stability of virtual environments this
year is being led by Gary Bishop and the HMD group. This year's motto is
"No Swimming", where swimming refers to the manner in which objects in
virtual worlds appear to slosh around when the user turns their head.
Swimming is the visible result of tracker lag, latency in the rendering
pipeline, and other bottlenecks in the system. Several different areas are
being actively worked on to improve the images we see in the head-mounted
display: motion prediction using Kalman filters, beam-racing and
"just-in-time-pixel display" to get rid of the inaccuracies due to the
image-scanout time, examination of static and dynamic jitter in the
trackers, and correction of the distortion in the HMD due to the optics
used to achieve a wide field-of-view.

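To make the motion-prediction idea concrete, here is a deliberately crude
stand-in for the Kalman filtering mentioned above: extrapolate the tracked
value ahead by the system latency, assuming constant velocity over the
last sample interval. The real filter also models acceleration and
measurement noise; this sketch only shows why prediction helps.

```python
def predict(prev, curr, dt, latency):
    """Extrapolate a tracked value (e.g. head yaw in degrees) ahead by
    the system latency, assuming constant velocity over the last
    sample interval. A crude stand-in for Kalman-filter prediction."""
    velocity = (curr - prev) / dt
    return curr + velocity * latency

# Yaw sampled at 60 Hz, with 30 ms of tracker-plus-rendering latency:
# the head moved 0.5 degrees last frame (30 deg/s), so we render the
# frame where the head will be, not where it was.
predicted = predict(10.0, 10.5, dt=1 / 60.0, latency=0.030)
```

Displaying the predicted pose instead of the last measured one is what
cancels most of the visible "swimming", at the cost of overshoot when the
constant-velocity assumption fails.
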
Aside from the war on swimming objects in virtual worlds, there are several
applications actively being worked on. The three major application projects
at this time are the Nanomanipulator, the ultrasound volume-visualization
project, and the architectural walkthrough.

The Nanomanipulator, Russell Taylor's projected Ph.D. dissertation topic, is
a joint project between the UNC Computer Science Department and the UCLA
Chemistry Department. UCLA provided a Scanning-Tunneling Microscope (STM),
to which Russell has created an inclusive interface, so that one can don
the HMD and actually change the surface of an object on a molecular level,
as well as feel the forces of the molecules via the ARM. The display
comes up on an HMD or a projection screen with cross-polarized shutter
glasses, and the user can interact with either the ARM or the billiard
ball. The hand-input device has various modes attached to it, which
include feeling the surface, zooming in on a certain part of it, or
altering it.

The ultrasound project was described in a paper at SIGGRAPH '92. The
department has acquired an old ultrasound machine, and the goal is to be
able to construct a volume visualization of the object being examined,
which would then be overlaid on top of live video and viewed with an
HMD. This would make it seem as if a person had X-ray vision. Testing is
commonly performed on a baby doll lying in an aquarium in the center of the
graphics lab, but tests with live subjects have been performed as well.
Closely associated with this project is the difficulty of overlaying
computer-generated imagery on top of the real world. The real world is
inherently real-time, while the computer-generated objects are going to be
a bit slower due to the various bottlenecks of the tracking and
image-generation systems. Different approaches for this application are
being examined, such as using a see-through HMD instead of viewing the
image overlaid on live video.

The architectural walkthrough was not originally in the plan for work this
year, but that decision changed when the walkthrough was pointed out as
the only application being worked on at UNC that makes it apparent when
graphics algorithms are incorrect. Most people have never seen surfaces
at a nanometer scale, or complex protein molecules, whereas an indoor scene
is something which nearly everyone experiences for large durations each day.
This makes the shading models developed for use on the new graphics
machines easier to debug, since almost anyone can look at an image
and tell whether or not it appears realistic. This year's approach to the
walkthrough deals largely with modelling details. In cooperation with
Virtus Corporation, the Walkthrough project team is developing a much more
intricate model of the upcoming expansion of Fred Brooks' house. The
models are created in Virtus Walkthrough software for the Macintosh, and
they are then uploaded to a Unix machine and converted to a Pixel-Planes
5-specific format. The hope is that this new model will also be a great
test for the upcoming PixelFlow machine.

Other work being pursued at this time includes the addition of Focal Point
software for producing directional sound, inclusion of TiNi ticklers for
tactile feedback, expansion of the current 3DM inclusive world-building
tool, and continued work on Richard Holloway's excellent vlib package.