Stereoscopic Displays and Virtual Reality Systems XII (2005)
Proceedings of the SPIE Volume 5664
Introduction
Welcome to Stereoscopic Displays and Virtual Reality Systems XII, Proceedings of Electronic
Imaging volume 5664. These proceedings combine in one volume the papers from two
separate but complementary conferences: Stereoscopic Displays and Applications XVI
and The Engineering Reality of Virtual Reality 2005. These conferences were two of the 23
conferences that comprised the 2005 IS&T/SPIE Electronic Imaging: Science and
Technology Symposium, held at the San Jose Convention Center, San Jose, California, USA,
in January 2005.
Stereoscopic Displays and Applications
This year's Stereoscopic Displays and Applications (SD&A) conference was the 16th in the
series. The conference, held during the three-day period 17 to 19 January 2005, featured a
broad range of topics, presentations, and events.
The first day of the SD&A conference started with a session titled "Convergence
Accommodation Issues", chaired by Andrew Woods. Most stereoscopic displays are
unable to provide changes in accommodation (focus) corresponding to changes in
convergence. This session included papers that discussed new displays capable of
producing changes in accommodation, thus preserving the link between convergence
and accommodation of human vision. Many people were impressed with the description
and images provided of a 128-view autostereoscopic display constructed at the Tokyo
University of Agriculture and Technology. Understandably the actual display could not be
transported to San Jose for the demonstration session, due to its size and complexity, but
many attendees were left wishing they could be immediately teleported to Japan to see it.
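As a rough illustration of the mismatch discussed above, the short sketch below computes the
distance at which the eyes converge for a given on-screen parallax, while the accommodation
(focus) demand of a conventional stereoscopic display stays fixed at the screen. The geometry,
names, and numbers are generic assumptions for illustration only and are not taken from any of
the papers in this session.

```python
# Sketch (not from any conference paper): vergence distance implied by screen
# parallax on a conventional stereoscopic display, versus the fixed focal
# distance of the screen itself. All quantities are assumed example values.

def vergence_distance_m(screen_distance_m, parallax_m, eye_separation_m=0.065):
    """Distance at which the two lines of sight cross, from similar triangles.

    Positive (uncrossed) parallax places the point behind the screen,
    negative (crossed) parallax places it in front; parallax equal to the
    eye separation would push the point to infinity.
    """
    return eye_separation_m * screen_distance_m / (eye_separation_m - parallax_m)

screen_distance = 2.0  # viewer sits 2 m from the screen (assumed)
for parallax_mm in (-30, 0, 30, 60):
    v = vergence_distance_m(screen_distance, parallax_mm / 1000.0)
    print(f"parallax {parallax_mm:+d} mm -> converge at {v:.2f} m, "
          f"but focus stays at {screen_distance:.2f} m")
```

The displays described in this session aim instead to let the focal distance track the vergence
distance, preserving the accommodation-convergence link of natural vision.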
The second session of the conference, titled "Human Factors," was chaired by John Merritt.
Three of the papers presented in this session discussed stereo-foveation, accommodative
load, and the smoothness of multi-view images. A standby paper on a multi-view
autostereoscopic display was also presented.
The third session of the conference, "Stereoscopic Image Processing," was chaired by Mike
Weissman. This was again a very popular session, and the five papers presented discussed
a wide range of topics, including arbitrary viewpoint imaging, rendering gaseous
substances, depth mapping, stereoscopic video coding, and recovering missing colors
from stereoscopic images.
The fourth session of the conference, "Autostereoscopic Displays," was chaired by Neil
Dodgson. The five papers in this session discussed various ways of implementing and
optimizing displays that can present stereoscopic images to an observer without the need
for the observer to wear any viewing apparatus (e.g. glasses). Displays discussed included
two-view and multi-view autostereoscopic displays.
The final formal session of the day was the 3D Video Screening Session, chaired by Andrew
Woods and Chris Ward. The purpose of this regular session is to showcase examples of how
3D video is being used and produced around the world. This year the following 3D material
(or segments thereof) was screened on the conference's high-quality polarized
stereoscopic rear-projection system:
- "Fish" by NHK (Japan) - a 5-minute piece showing the wonderful undersea life at
Zamami, Okinawa, Japan. [*8C] [This code represents the 3D format and the playback system - see the explanation following.]
- "Robogirl" and "Choices Homero" by Lightspeed Design Group (USA) - two short
computer animated pieces. [*8C]
- "Giants Exist" by Continuum Resources (Australia) for Whale World (Australia) - a 16-
minute documentary style piece about whales on the southwest coast of Australia.
[*5C]
- "3D Aurora" by Brian McClave (UK) and George Millward - an experimental piece
showing actual stereoscopic video of the Aurora Borealis (Northern Lights). [*2C]
- "Metro Safety Kids" by Dynamic Digital Depth (USA and Australia) for the Los
Angeles Metro Transit Authority (USA) - a predominantly computer-animated piece
that teaches kids how to travel safely on the Los Angeles Metro rail system. The
piece also included compositing of real-world video into the computer-animated
world and 2D to 3D conversion of real-world video. [*3C]
- "The Bermuda Triangle Undersea Adventure" by Powderkeg (USA) - a 4.5-minute
computer-animated underwater ride-film. [*8C]
- "The Creeps" - a movie originally filmed in 3D in 1997 for cinema release and
recently released in field-sequential 3D DVD format by '3D for your TV' (USA). [*1A]
- "Alaska 3D - Flora, Fauna and Fishin! (teaser)" and "Carstensz Pyramid - Adventure in
Irian Jaya (teaser)" by Tom Riederer, TreeD films. 3D adventure videos shot in exotic
locations. [*1C]
- "Avandavision 2004" by 21 st Century 3D (USA). A piece designed to advertise a new
Avanda pharmaceutical to doctors and sales representatives. [*4B]
- "Swimming with Phytoplankton" by Iona Scott for Kew Gardens (UK) - a short
computer-animated piece illustrating various forms of phytoplankton. [*7C]
- "Barney (the Owl) in 3D" (informal title) by Inition (UK) for the Royal Society for the
Protection of Birds (RSPB) (UK) - a short piece illustrating the difference between the
life of Barney the owl and a typical human and their effects on the environment.
[*6C]
- "Toyota product shots" (informal title) by Cobalt Entertainment (USA) - a beautifully
filmed piece showing various models of Toyota vehicles in an advertising style
compilation. [*8C]
- "Moon Man" by The National Film Board of Canada (NFB) - a stereoscopic
computer animation inspired by the Canadian folk song "Moonman Newfie" about
Codfish Dan, a folk hero fishing on the Milky Way. The animation was developed
using an IMAX SANDDE stereoscopic animation workstation. [*8C]
- "Falling in Love Again" by The National Film Board of Canada (NFB) - a delightfully
amusing computer animated piece set to the music "Falling in Love Again" sung by
Marlene Dietrich. A playful take on the vertigo of falling in love. The animation was
developed using an IMAX SANDDE stereoscopic animation workstation. [*8C]
As in previous years, a wide range of 3D video sources and playback systems were used,
including:
[*1] Field-sequential 3D NTSC
[*2] Field-sequential 3D PAL
[*3] Dual-channel 3D NTSC
[*4] "960P" Dual-channel progressive 3D NTSC
[*5] Dual-channel 3D PAL
[*6] Dual-channel widescreen 3D PAL
[*7] Dual-channel 3D 800x600
[*8] Dual-channel 3D High-Definition (HD) Video
[*A] played back from a single standard consumer DVD player and demultiplexed to
the two video projectors by a 3D Video Demultiplexer
[*B] played back from the "960P" synchronized dual DVD playback system
[*C] played back from the DepthQ 3D Cinema Server
The evening concluded with an enjoyable meal at the BoTown Chinese Restaurant in
downtown San Jose. It was a good chance for a large number of the conference
attendees to mix and talk in a relaxed atmosphere.
The second day of the SD&A conference commenced with a symposium-wide plenary
session that was particularly relevant to the SD&A conference. Dr. Justin Maki from the Jet
Propulsion Laboratory (Pasadena, California) presented a talk titled "20 Cameras on Mars:
The Mars Exploration Rover Imaging System". Dr. Maki's talk reviewed the camera
hardware on the rovers (which includes 4 stereoscopic camera pairs on each rover), and
also the history of the mission from the landing of the first of the two rovers on Mars in
January 2004 (including the "Cosmic Hole-in-One," whereby one of the rovers landed by
chance in the middle of a small crater) up to the present day. He also provided an inside
view of how the staff at JPL process the data received from the two rovers and use this
data to make mission decisions. The talk included polarized stereoscopic projection of a
selection of stereoscopic images from the mission to the approximately 600-strong
audience. Images were shown from all of the rovers' stereoscopic cameras and included a
number of stitched stereoscopic panoramas and a number of computer-generated
stereoscopic images created from special usage of the microscopic imaging camera. The
talk was well received and provided an important insight into how stereoscopic imaging is
being used in planetary science.
The first technical session of the SD&A conference for the day was "2D to 3D conversion"
chaired by Gregg Favalora. The two papers in this short session discussed two methods,
automatic and manual, of converting 2D images or video to 3D.
The next session was "Stereoscopic Video" chaired by Andrew Woods. The four papers in
this session covered topics including HD stereoscopic video cameras, pre-rendering of
stereoscopic video animations, autostereoscopic monitor image pattern creation
methods, and a method of showing 3D video on integral 3D displays.
After lunch a session on "Stereoscopic Developments" was chaired by Vivian Walworth.
The session contained four papers that discussed a wide selection of topics relating to new
methods for displaying stereoscopic images.
This was followed by a short two-paper session, "Depth Mapping," chaired by Vivian
Walworth. The purpose of this session was to provide focus on methods for extracting and
processing depth maps for stereoscopic content.
The final session of the day featured our ever-popular Demonstration Session. This session is
the perfect chance for attendees and visitors to obtain a hands-on and eyes-on
experience of the latest in stereoscopic displays and imaging systems. It was pleasing to
see such a large array of different stereoscopic imaging systems on display and an even
larger audience actively engaging with the various displays.
This year the following items were on show at the demonstration session:
- Robert-Paul Berretty and Frans Peters from Philips (Netherlands) demonstrated a new
Philips multi-view autostereoscopic display based on a switched lenticular lens array
filled with liquid crystal.
- Nic Beames from Dynamic Digital Depth (Australia and USA) demonstrated DDD
software "TriDef Autostereo 3D tools" and "TriDef Player" on a newly released Sharp
Mebius PC-AL3DH autostereoscopic 3D laptop, and a 30" µPol 3D LCD display from
Arisawa (Japan), viewed using polarized 3D glasses.
- Igor Troitski demonstrated a large selection of laser-induced volumetric images
inside crystals.
- RAFAEL (Israel) demonstrated automatic video-to-stereoscopic-video conversion
software.
- Nick Holliman from University of Durham (UK) demonstrated new methods for
creating stereoscopic images with controlled perceived depth on a Sharp RD3D
laptop.
- Hongen Liao from University of Tokyo (Japan) demonstrated a long viewing
distance integral photography autostereoscopic display.
- Peter Wimmer from Johannes Kepler University (Linz, Austria) demonstrated his
shareware "Stereoscopic Player" and "Stereoscopic Multiplexer" software.
Connected to his laptop were a stereo pair of Sony Handycams and also a
stereo pair of USB web-cam eye-ball cameras that were set up to show live
stereoscopic video on the laptop screen in anaglyph format.
- Liang Zhang from Communications Research Centre (CRC) (Canada)
demonstrated the results of their study into disparity estimation and multi-view video
generation.
- Mark Feldman from StereoGraphics (California) demonstrated their "SynthaGram
404" (40" autostereoscopic LCD), "SynthaGram 202" (20" autostereoscopic LCD) and
their Photoshop "3D imaging" plug-in.
- Jason Goodman from 21st Century 3D (New York) demonstrated the "3DVX"
stereoscopic video camera.
- Kazuki Taira from Toshiba Corporation (Japan) demonstrated a prototype
autostereoscopic display on a Toshiba notebook computer.
- Alan Sullivan from Lightspace Technologies (Connecticut) demonstrated the
DepthCube 3D volumetric display.
- John Miller and Brad Nelson from Dep3D (California) demonstrated a variety of
stereoscopic PC games and applications on their 40" dual rear-projection circularly
polarized stereoscopic display.
- Scott Robinson and Chuck McLaughlin from Planar Systems (Oregon) and
McLaughlin Consulting Group (California) demonstrated the StereoMirror 3D display.
- Steve Schklair and Bernie Butler-Smith from Cobalt Entertainment (California)
demonstrated their dual 720P high-definition stereoscopic video camera.
- Andrew Woods from Curtin University of Technology (Australia) and Tony Hall from
Welaptega Marine Ltd (Canada) demonstrated their 4000 m depth-rated
underwater stereoscopic video camera and example stereoscopic video filmed
with the underwater camera played back from a field-sequential 3D DVD on a
Sharp RD3D autostereoscopic laptop.
- Steven Smith from VRex Malaysia and Mike Roche from VRex USA introduced the
"AutoBin Clipon," a product that allows for the after-market attachment, via a
magnetically enabled frame, of either a TNµPol (twisted nematic micro-polarizer) or
a two-view autostereoscopic parallax barrier to allow a standard LCD to be used for
stereoscopic display. Introduced also was a 17" TNµPol micro-polarizer, which allows
a range of 17" consumer LCD displays to be used for stereoscopic display. They also
demonstrated SterVu™ - a 2D to 3D image conversion software suite.
- Samuel Zhou from IMAX Corporation (Canada) provided a range of technical
literature about the IMAX 3D process and showed movie posters from two recent
IMAX 3D movies: "The Polar Express" and "NASCAR 3D".
- Steve Berezin from Berezin Stereo Photography (California) demonstrated a wide
variety of consumer stereoscopic products, including various 3D glasses, viewers,
books and software.
- Chris Chaleki from Progressive 3D (Maine) demonstrated a digital stereoscopic
video camera (1024x768 x2) with camera-link interface.
- Alan Silliphant from Anachrome 3D (California) demonstrated the Anachrome 3D
glasses and images.
Pictures of the demonstrations listed above are available at the conference website:
www.stereoscopic.org
Following the demonstration session, the 13 SD&A conference poster authors presented
their posters in the symposium-wide poster session.
Also on display on Tuesday and Wednesday was a Phantogram Exhibit organized as an
event of both the Electronic Imaging Symposium and the Stereoscopic Displays and
Applications conference. The Phantogram Exhibit collected the works of a wide selection
of artists and authors to form the largest ever collection of phantograms in one exhibit;
about 100 phantograms were on display. For those not aware of what phantograms are,
they are a relatively new art form based on standard stereoscopic display techniques
combined with a special geometric modification that makes the images look as though
they are part of the real-world viewing volume in front of you. Phantograms are usually
laid flat on a table and viewed from an angle of about 45 degrees (although other
configurations exist); a simple sketch of the underlying projection appears after this
paragraph. If constructed well, the virtual three-dimensional images look as though they
are sitting there right in front of you - part of your world, not a virtual world. A large part of the
exhibit was dedicated to phantograms of the Mars surface - constructed from
stereoscopic images captured by the Mars Exploration Rovers Spirit and Opportunity. It was
fortunate that we were able to exhibit these images on the same day as the plenary
presentation by Dr. Justin Maki from JPL on the topic of the Mars Exploration Rovers. The
phantogram exhibit included works from the following authors and artists: Achim Bahr
(Germany); Boris Starosta, Starosta.com; Steve Aubrey; Owen (Wes) Western, 3D on the
Level; John Adlersparre, Magic Mosaics (Canada); Gilbert Detillieux (Canada); Takashi
Sekitani, StereoEye (Japan); Steve Hughes; Terry Wilson, Terryfic3D; and Andrew Woods
(Australia). We appreciate the permission of Steve Aubrey and Owen (Wes) Western, who
each hold patents relevant to the topic of phantograms. The phantogram exhibit was
coordinated by Terry Wilson and Andrew Woods.
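The sketch below is a minimal illustration of the projection that underlies a phantogram,
assuming a simple configuration: a viewer's eyes at known positions above a table, and a
virtual point standing above the table surface. Each eye's view of the point is projected
along the line of sight down onto the table plane, and the two projected images are what
would be printed flat on the table. The coordinates and eye positions are arbitrary example
values, not taken from any of the exhibited works.

```python
# Minimal sketch (assumed example geometry, not any exhibitor's method):
# project a virtual 3D point onto the table plane (z = 0) from each eye,
# which is the basic distortion used when preparing a phantogram that is
# laid flat on a table and viewed from roughly 45 degrees.

def project_to_table(eye, point):
    """Intersect the ray from the eye through the point with the plane z = 0."""
    ex, ey, ez = eye
    px, py, pz = point
    t = ez / (ez - pz)          # parameter where the ray reaches z = 0
    return (ex + t * (px - ex), ey + t * (py - ey))

# Eyes roughly 0.5 m above and 0.5 m back from the table centre (assumed),
# separated horizontally by about 65 mm.
left_eye  = (-0.0325, -0.5, 0.5)
right_eye = ( 0.0325, -0.5, 0.5)

# A virtual point floating 0.1 m above the middle of the table.
virtual_point = (0.0, 0.0, 0.1)

print("left-eye image on table: ", project_to_table(left_eye, virtual_point))
print("right-eye image on table:", project_to_table(right_eye, virtual_point))
```

When each eye is shown only its own projected image (for example through anaglyph or
polarizing glasses), the point appears to stand above the table surface rather than lie in it.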
The third day of the SD&A conference started with a discussion forum on the topic "Crimes
against 3D". The forum was chaired by Lenny Lipton (StereoGraphics) with panelists John
Rupkalvis (Stereoscope International), Samuel Zhou (IMAX Corporation) and Josh Greer
(Real D). Paraphrasing Lenny Lipton's opening remarks:
The title of this discussion was chosen to provoke discussion. Because of the people
on the panel the discussion will probably be motion picture centric but it can be
much broader. As I see it, the discussion can fall into two categories - one would be
system design involving primarily engineering, and the other portion of concern
would be how the pictures are captured and generated. 3D has had a long and
troubled history since 1838, when Wheatstone announced the discovery of
stereopsis and the invention of the stereoscope. Over the years stereoscopy has
come and gone. In the Victorian era the stereoscope was the equivalent of today's
television. But I recall finding in an 1898 magazine a critique complaining that stereo
cards were improperly mounted. Skipping ahead, Polaroid
Corporation put a lot of effort into stereoscopic projection with polarized images in
the 1950s. There was a large boom of 3D movies peaking in 1953 but it did not
continue long term. What we have today is a growing number of 3D theatres in
theme parks and IMAX 3D theatres, and attempts are now being made to
reintroduce 3D into the mainstream theatrical cinemas. Having said that, as I say to
my kids, you have to accentuate the positive, eliminate the negative, and don't
mess with Mr. In-between. What do you perceive to be the greatest stumbling block,
what needs to be overcome, what crime against 3D needs to be addressed?
A spirited and well-intentioned discussion ensued. The discussion forum received very good
input from the audience, and the panel responded with good humor and good insight.
There was general consensus that we all need to work hard (as creators and critics) to
maintain the high quality of 3D (systems and content) now possible and educate the
purveyors and consumers of 3D so as to avoid poor-quality 3D.
The first technical session of the day was on "Volumetric 3D Displays" and was chaired by
Gregg Favalora. The four papers in the session discussed topics including using lasers to
produce permanent static volumetric images inside large crystals, new technologies for
displaying volumetric images, and methods of interacting with data displayed in
volumetric 3D displays. The final paper of this session showed some particularly good videos
that illustrated well the interaction methods discussed in the paper.
This year, for the first time in the history of the SD&A conference, it was necessary to run two
parallel sessions in order to squeeze in some more papers that we felt should be presented
at the conference. One of the parallel sessions was titled "Integral 3D Displays" and was
chaired by Nic Holliman (University of Durham, UK). Integral 3D displays are a blossoming
area of interest, and the four papers in this session discussed a broad range of methods for
displaying integral 3D images as well as processing of integral 3D image data. The other
session held at the same time was "Telemanipulator and Telepresence Technologies,"
chaired by Andrew Woods and Ian McDowall. This session was born out of our attempt to
continue the "Telemanipulator and Telepresence Technologies" conference series, which
was last held in Boston at Photonics East in 2001. Unfortunately we did not receive sufficient
papers to run an entire conference on this topic. However, we still wished to have a
selection of those papers presented. Hence this session was held as a joint session between
the Stereoscopic Displays and Applications conference and The Engineering Reality of
Virtual Reality conference. The first of three papers discussed a small-scale study
conducted in conjunction with Intuitive Surgical to explore the performance changes
when the stereoscopic camera separation is reduced in a laparoscopic surgery
environment. The second paper was intriguing and showed the ability to stream
panoramic video at a resolution of 1344x672 from a 6-view camera. The video was
streamed into a web browser that provided mouse-type interaction to explore the remote
location. The final paper in this session, which outlined an implementation of a flammability
model, showed the integration of that model into a fire training application.
The next session was "Stereoscopic Display Applications," chaired by John Merritt. The
single paper presented in this session discussed the historical and developing usage of
stereoscopy in orthopedics. The author reviewed the many benefits that can apply in this
field and, on a practical level, what tools can now be used to implement stereoscopy. The
other two papers originally scheduled for this session were not presented.
The final session and highlight of this year's Stereoscopic Displays and Applications
conference was the Keynote Presentation titled "Digital Technology and the Resurgence of
Commercial Stereoscopic Entertainment," presented by Steve Schklair from Cobalt
Entertainment.
Not since the 1970s has there been so much activity in stereoscopic cinema in the
commercial and corporate film sector. Advances in digital technologies enabling
both efficient capture and exhibition have eliminated most of the obstacles faced
by previous generations of 3D entertainment. At the front end are new high-resolution
digital cameras on "intelligent" motion-controlled shooting platforms. In
the middle are new digital post-production tools. At the back end is the coming
rollout of commercial digital cinemas accompanied by recent developments in
stereoscopic projection technologies. Taken together, these tools and techniques
are behind the new wave of stereoscopic filmmaking that is changing the
paradigm of the mainstream Hollywood movie business.
In Steve's presentation he stressed that the future is bright for the stereoscopic world, but it
is also a world that will demand uncompromising quality. And as the audiences become
more visually and stereoscopically literate, they will continue to expect higher and higher
standards. Steve's presentation was interspersed with the stereoscopic projection of some
of his recent 3D HD works. Titles shown on the specially set up 14 x 8-foot front-silvered
screen included "NFL Football 3D Test", "Toyota product shots", and "Superbowl to the Max
trailer". We thank Steve for his insightful presentation and look forward to seeing more of his
3D HD footage at future conferences.
Many individuals and companies contributed in various ways to make this year's SD&A
conference a very successful meeting:
- This year the SD&A conference was formally sponsored by MacNaughton Inc.
(Beaverton, Oregon), VRex Malaysia (Cyberjaya, Malaysia), and IMAX Corporation
(Mississauga, Canada). Conference sponsorship is a very valuable way for companies
to support the running of the conference and gain valuable marketing exposure.
We thank the sponsors for their support.
- The conference committee plays an important role throughout the year ensuring
the correct technical direction of the meeting. Sincere thanks go to Neil Dodgson,
Gregg Favalora, Janusz Konrad, Shojiro Nagata, Lew Stelmach (in memoriam), and
Vivian Walworth.
- The ability to present high-quality large-screen stereoscopic images and video at
the conference is an extremely important part of the conference. Many people and
companies contributed hardware, software, and expertise to make this a truly
impressive show. Particular thank-yous go to: Brad Nelson of Nelsonex (Los Gatos,
California), Chris Ward, Michal Husak, and Dan Lawrence of LightSpeed Design
Group (Bellevue, Washington), Steve Schklair and Bernie Butler-Smith of Cobalt
Entertainment (North Hollywood, California), Edwards Technologies, Inc. (ETI) (El
Segundo, California), Adrian Romero and the staff from Spectrum Audiovisual
(Denver, Colorado), Jason Goodman of 21st Century 3D (New York, New York), and
Tom Riederer from TreeD Films (Santa Barbara, California). Conference video
equipment included a DVD player, a 3D demultiplexer, two QD line doublers, an 8 x 6-foot
stereoscopic rear-projection screen, and two Proxima Pro AV 9400 video projectors (all
provided by Nelsonex), a DepthQ Stereoscopic Media Server computer and software
(LightSpeed Design Group), a 14 x 8-foot stereoscopic front-silvered projection screen
(Cobalt Entertainment), two Panasonic PT-D7600 projectors (ETI), an S-VHS player and
general AV equipment (Spectrum Audiovisual), dual industrial DVD players and a
DVD playback synchronizer (21st Century 3D), and another DepthQ playback
system (TreeD Films).
- Thanks also to Takashi Sekitani (Tokyo, Japan), whose software "3D Slide Projector"
was used for digital stereoscopic still image slide presentation at the conference.
- A special thank-you also to those who helped make the 3D video screening session
run so smoothly.
- Thanks to the demonstration session presenters for making equipment available to
show to the conference attendees. Some equipment traveled from overseas,
making the contribution to the meeting particularly praiseworthy.
- I am sure the authors and attendees appreciated the diligence and hard work of
engineer Stephan Keith, who performed the role of AV monitor this year.
- Particular thanks are also due to the staff at SPIE and IS&T, who were instrumental in
helping organize the conference.
- Most importantly, we must thank the conference authors and attendees, who
ultimately made this meeting such a successful event.
This year two prizes were again offered as part of the SD&A conference. The prize for "the
best use of the available stereoscopic presentation tools during the conference technical
sessions" was won by Dr. Ezekiel Tan from Royal Newcastle Hospital (Australia) for his
presentation "Stereoscopy in orthopaedics." Dr. Tan's presentation was presented entirely in
stereoscopic 3D and was richly illustrated with many stereoscopic images. His presentation
also included a quick demonstration of a volumetric rendering program with an
orthopedic example (in stereoscopic 3D). Dr. Tan's prize was a copy of the book "3D
Australia" (by Ken Duncan and Leo Meier, ISBN: 0958054444) featuring stereoscopic
photographs taken all around Australia. A runner-up prize was awarded to Serdar Ince
from Boston University for his presentation "Recovery of a missing color component in stereo
images (or helping NASA find little green Martians)." Mr. Ince's presentation included a
series of full-color stereoscopic images reconstructed from images taken by the Mars
Exploration Rovers. Both prizes were provided courtesy of Ken Duncan Panographs
(Australia) <www.kenduncan.com>.
A common discussion point during the conference was applications of stereoscopic
displays. For the 2006 conference we will therefore attempt to encourage more papers on
this topic.
We lost another good friend, colleague, and expert in stereoscopic imaging this past year:
Dr. Lew Stelmach, who had been a member of the program committee of the
Stereoscopic Displays and Applications conference since 2002, died of cancer in
June 2004. Many conference attendees will have met Lew and will know of his work at the
Communications Research Centre (CRC) in Canada. His death was a hard blow for the
SD&A conference committee, coming only six months after the death of conference
co-chair Steve Benton (also of cancer).
The conference activities don't stop at the end of the January meeting. The SD&A
conference website remains as a focus for conference activities during the time between
conferences. We will be seeking abstracts for the 2006 conference mid-year. You can join
a mailing list to receive conference announcements. The website also provides a
timetable of important meeting deadlines and a significant selection of photographs
highlighting the activities of past conferences. In addition, it hosts the stereoscopic
virtual library, from which two
classic texts are available for free download: Herbert McKay's "Three Dimensional
Photography" and Lenny Lipton's "Foundations of the Stereoscopic Cinema". A third title will
also soon be available. Visit the conference website to gain an understanding of the past,
present, and future of stereoscopic imaging, and most of all think now about presenting a
paper or attending next year's conference. The Stereoscopic Displays and Applications
conference website is located at: www.stereoscopic.org
The Engineering Reality of Virtual Reality
This year's Engineering Reality of Virtual Reality conference began with a session chaired by
Mark Bolas on "Systems I". Jesse Eichenlaub from Dimension Technologies spoke about a
passive method of eliminating accommodation/convergence mismatch in stereoscopic
head-mounted displays. The paper discussed out-of-focus and cross-eyed cues, perfect
subject matter for a session that began at 8:30 am. The Ecole Supérieure des Technologies
Industrielles Avancées sent Fabrice Depaulis to discuss work that fills the gap between
computer-based design and the physical assembly of real parts. The project is named
ESKUA, which means 'hand' in the Basque language, and is based on using interactors and
a handling platform. The goal is not perfect prototyping, but to make the designer aware
of problems that might be present at each stage of assembly in a project. Emad Barsoum
presented work on creating WebVR: an interactive web browser for virtual environments. It
allows real-time web access from within a virtual environment and supports web browser
capabilities. This offers a number of advantages, including the ability to interact with such
information in a two-dimensional fashion without requiring the user to leave the immersive
environment. Dave Pape, a professor in media studies at SUNY Buffalo, presented his work
"What's 'good enough'? Some experiments with projected VR quality". He offered a number
of observations on his experiences configuring, using, and
showing virtual reality-based art on projection-based systems, and highlighted the
interesting trade-offs between stereoscopy, image size, and motion tracking. Closing the
session was Sung-Jin Kim, a graduate student at UC Irvine involved in the VIS group and a
member of the DREAM lab. His paper reported on creating a real-time distributed display
system and discussed the challenges of creating a distributed system, including response
time, fairness, consistency and scalability.
Papers from the United Kingdom and Japan filled the late morning session on "Mixed
Reality" chaired by Mark Bolas. The Nara Institute of Science and Technology presented
three papers, continuing to present their ongoing efforts in this field. The first paper,
presented by Yoshihiro Yasumuro, was entitled "Projection-based augmented reality with
automated shape scanning". This paper highlighted project-specific results from an
ultrasound application: Gray-code stripes are projected in the infrared spectrum to
determine the surface geometry of a patient's body, upon which imagery is then projected
(a generic sketch of Gray-code stripe decoding appears after this paragraph).
Continuing the theme of infrared light, Yusuke Nakazato presented work on the localization
of wearable viewers using invisible retro-reflective markers and an IR camera. Masayuki
Kanbara presented the last of Nara's papers, "Three-dimensional reconstruction of outdoor
environments from omnidirectional range and color images," which described a system that
creates 360-degree environments from captured imagery. Presenting research from the
other side of the globe was Paul Kitchen from the University of Southampton, who cleverly
applies vision processing to images of the palm. His system uses the palm as a natural
fiducial marker, thus providing mapping information for augmented reality applications.
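As background for the structured-light approach mentioned above, the short sketch below
shows how a stack of Gray-code stripe patterns can be decoded into a stripe index per pixel.
It is a generic, textbook-style illustration under assumed names and example data, not the
specific infrared system described in the Nara paper.

```python
# Generic Gray-code structured-light sketch (assumed example, not the Nara
# group's implementation): each projected pattern contributes one bit per
# pixel; decoding the Gray code recovers the projector stripe index, which
# in turn lets the surface geometry be triangulated.

def encode_gray(index: int) -> int:
    """Convert a stripe index to its Gray-code value."""
    return index ^ (index >> 1)

def decode_gray(gray: int) -> int:
    """Convert a Gray-code value back to the original stripe index."""
    index = 0
    while gray:
        index ^= gray
        gray >>= 1
    return index

# One pixel's thresholded observations of, say, 8 projected patterns,
# most significant bit first (assumed example data).
bits = [1, 0, 1, 1, 0, 0, 1, 0]
gray_value = 0
for b in bits:
    gray_value = (gray_value << 1) | b

stripe = decode_gray(gray_value)
print(f"observed bits {bits} -> stripe index {stripe}")
assert encode_gray(stripe) == gray_value  # round-trip check
```

In a real system each bit would come from comparing camera images of a pattern and its
inverse; here the bit vector is simply assumed for the sake of the example.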
After lunch the session titled "Systems II," chaired by Ian McDowall, included three papers.
The first paper, "Large format 3D interaction table," was presented by Mr. J. Gustafson from
the Royal Institute of Technology in Sweden. This paper presented their work to create
stereo images through the use of reflection holograms. The system employs multiple
projectors that cast images onto the reflection hologram. The light reflected from the
hologram creates a stereo (or multi-view stereo) image for the viewer when the viewer is at
the correct viewing position. The holograms are largely transparent, so they may also be
used in an augmented reality configuration. The second paper in this session,
"Stereoscopic stimuli are not used in distance evaluation in multicue virtual environments,"
presented by Mr. A. Kemeny of Renault in Paris, explored the issues of perceived scale in
virtual environments. The research compared the perception of distances under several
conditions. These conditions included a large format stereo display, a head-mounted
display, and the real world. The research explored the relative influence on the perception
of distance of several factors, including field of view. The final paper in this session,
"ShadowLight: a flexible environment for multipurpose design and evaluation," presented
by Mr. K. Leetaru of the University of Illinois/ Urbana-Champaign, presented a software
system called ShadowLight. The paper presented several examples of using ShadowLight in
a CAVE-type environment. The system is based on a plug-in model where different plug-in
modules offer unique interaction and simulation capabilities.
The next session "Systems III," chaired by Ian McDowall, contained three papers. The first
paper, "Quantitative comparison of two stereoscopic 3D interaction methods," presented
by Mr. Zahir Alpaslan of the University of Southern California, explored people's perception
of different stereo display solutions. The study compared an autostereoscopic Sharp
display with shutter glasses on a CRT display. The task was to re-orient a shape to
match a sample. The performance and people's impressions of the task were then
evaluated. The next paper, "Experiments in interactive panoramic cinema," was presented
by Prof. S. Anderson. The presentation outlined some of the historical efforts to recreate
panoramic experiences. These included the 1900 recreation of a balloon ride over Paris
that used 10 projectors imaging onto a spherical screen. Immersive experiences in a
historical context followed a progression of the Exotic, the Old, and the Violent. At the
conclusion of the presentation, two student-created panoramic experiences were
presented. The content was captured using a multi-deck, multi-camera rig that
recorded the panoramas. The final paper of the session, "Import and visualization of
clinical medical imagery into multiuser VR environments," was presented by Mr. A. Mehrle
of the Johannes Kepler University Linz (Austria). The medical source data for the
visualizations were from the field of ENT (ear, nose, and throat) and included
CT data collected at a 160-µm slice thickness. The data was remarkable in
that one could see the tiny bones of the inner ear. The segmented data was then
displayed at huge scale in a CAVE environment, where it could be explored by one
person.
The final session of The Engineering Reality of Virtual Reality conference was "Virtual Reality
Works: Demonstration and Panel Discussion," chaired by Margaret Dolinsky, Indiana
University, and Daniel Sandin, University of Illinois at Chicago. This session was in a completely
different format from the other presentations. Dan Sandin (University of Illinois at Chicago,
Chicago, Illinois), Margaret Dolinsky (Indiana University, Bloomington, Indiana), Dave Pape
(University of Buffalo, Buffalo, New York), and Daria Tsoupikova (University of Illinois at
Chicago, Chicago, Illinois) brought a Linux-based VR system for people to experience at
the conference in San Jose. Also participating in the experience were Julieta C. Aguilera,
Helen-Nicole Kostis, and Josephine Anstey. They controlled avatars that also inhabited the
virtual world we experienced. Their presence (both as avatars and through voice) in the
environment was created over the network, and they assisted in guiding participants
through the various artworks being presented. The content included various artworks
created for these immersive environments. The individual pieces were accessed through
Confluxus by Todd Margolis, which created gateways or portals to the other worlds. The
avatars would gather in Confluxus and discuss where to go next. All the worlds were both
visual and auditory experiences. Beat Box by Margaret Dolinsky with Edward J. Dambik
offers participants a world of various musical instruments. The lacy floors and ramps of the
world lead to various sound machines, including a sequencer and a set of drums.
Rutopia by Daria Tsoupikova offers visitors a garden world composed of Russian folk
symbols. Geometric trees grow in response to the touch of the avatars and enable the
avatars to fly through the space. Kites Flying In and Out of Space by Jackie Matisse, Ray
Kass, Tom Coffin, Tom Johnson, and Dave Pape offers a unique way to experience the kites
created by Jackie Matisse. The kites can be picked up by an avatar and flown through the
virtual space. The kites are modeled using finite element methods and flow smoothly as the
avatars fly through the space, letting the kites billow and flow behind them. Looking for
Water by Daniel Sandin, with sound by Laurie Spiegel (and thanks to Dick Ainsworth and
Tom DeFanti), presents an unlikely Martian landscape with several fountains of water. The
motion and modeling of the water occur in real time and can be under the control of the
various avatars in the "shower room," where each avatar can pick up a water hose to
spray at the others. Animagina by Julieta Aguilera, Helen-Nicole Kostis, Tina Shah, Seung
Kang (with thanks to Alex Hill, Geoffrey Baum, and Damin Keenan) explores the nature of
symbols. It starts out in a yellow world peppered with classic amphora-shaped vases
that can be manipulated. One grows large and we enter to find pyramids, stairways
and an eye. Paapab by Josephine Anstey, Dave Pape, and Dan Neveu, with music by
Dan Neveu and additional modeling by Joseph Alexander, Sara Nohejl, Beth Cerny, and
Yalu Lin and software by Ygdrasil, creates a virtual dance floor environment. The dance
floor pulses to the beat and is populated with a number of life-sized animated characters.
The visiting avatars can go up the light shaft to control and record the dance motion for
one of the characters, which then flies off to join the dance below.
Conclusion
Next year the Stereoscopic Displays and Applications conference and The Engineering
Reality of Virtual Reality conference will be held 15 to 19 January 2006, at the San Jose
Convention Center, San Jose, California, as part of the 2006 IS&T/SPIE Electronic Imaging:
Science & Technology symposium. The 2006 conferences promise to continue a tradition of
presenting and demonstrating the latest technologies relevant to stereoscopic displays
and virtual reality. Please consider attending, presenting, or demonstrating at the 2006
Stereoscopic Displays and Applications conference and The Engineering Reality of Virtual
Reality conference. Photonics West will be held the following week, also at the San Jose
Convention Center (the Stereoscopic Displays and Applications conference and the
Engineering Reality of Virtual Reality conference will not be part of Photonics West in
2006).
Andrew J. Woods
Mark T. Bolas
John O. Merritt
Ian E. McDowall