Stereoscopic Displays and Applications XXVII (2016)
Proceedings of the Electronic Imaging Symposium
The twenty-seventh annual Stereoscopic Displays and Applications (SD&A) conference was held in downtown San Francisco at the Hilton Union Square hotel. The SD&A conference remains the premier venue for the dissemination of research on topics relating to stereoscopic displays and their applications.
SD&A attracts key players in the field: stereoscopic experts from industry and academia presented the two keynotes, participated on the discussion panel, and spoke in the technical presentations. The conference had an excellent technical program covering a wide range of stereoscopic topics.
This year the conference received 51 submissions. Of these, 35 were accepted for oral presentation (68%), with an additional seven interactive papers accepted as posters. Two joint sessions were also held with co-located conferences - the Engineering Reality of Virtual Reality 2016 conference and the 3D Image Processing, Measurement (3DIPM), and Applications 2016 conference.
This conference proceedings volume contains the technical papers supporting most of the oral presentations and posters given at the conference. This year a new presentation-only category was introduced for industry authors who did not submit a manuscript. Additionally, one of the conference manuscripts has been published in the IS&T Journal of Imaging Science and Technology (JIST). The conference program listing at www.stereoscopic.org/2016 indicates which presentations were presentation-only, which have a supporting manuscript, and where to find it. All manuscripts from the SD&A 2016 conference are open access - happy reading!
This year's SD&A conference took place 15-17 February 2016 as part of the 28th annual IS&T International Symposium on Electronic Imaging: Science and Technology, at the Hilton San Francisco Union Square Hotel, in downtown San Francisco.
This introduction gives an overview of the conference as a reminder for those who attended and an insight into what happened for those who were unable to attend.
The first day had three technical sessions, covering Light-Field and Super-Multiview Displays, 360° 3D, and 3D Content. The 360° 3D session was a new theme which included papers on walk-around 3D displays, walk-in 3D displays which wrap around the user, and techniques for capturing 360° 3D content that could be displayed in 3D on a head-mounted display. The day also included the first of three Electronic Imaging (EI) Symposium Plenary presentations, the first of the two SD&A keynotes, the Electronic Imaging symposium reception, the 3D Theater Session, and, if that wasn't enough, the annual SD&A conference banquet.
The first Keynote Presentation was given by SD&A conference co-chair Andrew Woods from Curtin University (Australia) and was entitled "Two shipwrecks, 2500 metres underwater, six 3D cameras - let the survey begin". The presentation summarised an expedition led by Curtin University, WA Museum and DOF Subsea to conduct a 3D imaging survey of two historic shipwrecks off the Australian coast - the HMAS Sydney (II) and HSK Kormoran. The presentation provided an overview of the expedition, a summary of the technology deployed, and an insight into the 3D imaging materials captured. A custom camera and lighting package was developed for the expedition, which included six 3D cameras and fourteen digital still cameras fitted across the two ROVs for the purposes of capturing feature photography, cinematography and 3D reconstruction photography. The six underwater stereoscopic cameras (three on each ROV) captured a mix of 3D HD video footage, 3D stills, and 3D 4K video footage. A 3D teaser documentary was screened to illustrate the footage captured during the expedition. It was clear why Andrew had missed the past two conferences while working on this project.
The Monday EI Plenary was presented by Audrey (Ellerbee) Bowden from Stanford University (USA) on "Novel Tools for Optical Imaging and Sensing at the Microscale and Nanoscale". Her presentation included a discussion of Optical Coherence Tomography (OCT), which is a non-invasive imaging technique that uses optical interference to take cross-sectional images of a broad range of materials such as human skin or even the retina. The captured 3D data could potentially be presented stereoscopically. The three EI plenary presentations were held in the combined SD&A and HVEI (Human Vision and Electronic Imaging) conference halls - it was great to see the hotel and AV staff working so efficiently to open and close the air-wall and reconfigure the AV between the two rooms.
The two-hour 3D Theater Session is a regular highlight of the conference that showcases 3D content from around the world. This year, the following thirty pieces (or segments thereof) were screened:
- "Amour Fou 3D" - Florian Werzinski (Germany)
- "Happy When It Rains" - Karel Bata (UK)
- "zeitRaum II" - Volker Kuchelmeister (Australia)
- "We Built a Ship" - Stefan Sargent (USA)
- "Chlamydomonas reinhardtii 3D - From Biological Cells to Biofuels" - Niklas Biere, Björn Sommer (Germany / Australia)
- "Inside the Dome" - Stuart Bender and Mick Broderick (Australia / Japan)
- "Safety Geeks SVI" - Roger Tonry and Tom Konkle (USA)
- "Riga - 2041" - Karlis Vitols, Triin Ruumet, Adina Istrate, Didzis Eglitis (Latvia)
- "The Simple Carnival - The Problem with Friends" - Jeff Boller (USA)
- "Transference" - Sean Arden (Canada)
- "The End of the Dark Ages" - Ralf Kaehler (KIPAC/SLAC), Marcelo Alvarez (CITA), Tom Abel (KIPAC/SLAC) (USA)
- "Adidas Originals 'Bushwick'" - Ben Schwartz (USA)
- "Magic Field 3D" - Masuji Suto (Japan)
- "DNA of Angel" - Aleksey Osipenkov (Russia)
- "Towards a Six-Dimensional Cinema" - Peter Rose (USA)
- "Valor Cat" - Ben Reicher (USA)
- "Crime Squad 3D (Episode 6: Interview3D)" - Enhanced Dimensions (UK)
- "Educational 3D Content: Onnabori" - Shibata lab, Tokyo University of Social Welfare (Japan)
- "Grami's Circus Show - Season 1" - Studio Gale Co, Ltd. and KBS Media (South Korea)
- "Pocket Universes Macro Shoot" - Eric Deren, Dzignlight Studios (USA)
- "City Kay Live in Printemps de Bourges" - Fabien Remblier (France)
- "Amongst" - Chisa Hidaka and Benjamin Harley (USA)
- "Carta De La Muerte A Frida (A Letter For Frida From The Death)" - Ana Leticia Reyes and Diego Sandoval (Mexico)
- "Every Two Minutes" - Catriona Baker and Curvin Huber (USA)
- "Geopark 3D Teaser" - Helio Augusto Godoy de Souza (Brazil)
- "Inside Out" - Pixar Animation Studios (USA)
- "CODA" - Denis Poulin and Martine Époque (Canada)
- "Big Hero 6" - Walt Disney Studios (USA)
- "Aliens Dancing Sirtaki" - San Base (Canada)
- "Lava" - Pixar Animation Studios (USA)
All entries were screened in high-quality polarized 3D on the conference's large projection screen.
The Best of Show awards were judged by Eric Kurland (3-D Space), Shyam Kannapurakkaran (stereoscopic artist and president of the LA 3-D Club), and Dan Sandin (University of Illinois at Chicago). Content contributors self-selected whether they wished their entry to be included in the competition or the demonstration category.
The judges chose the following 3D content winners:
Best of Show Live Action category:
"Amongst" by Chisa Hidaka and Benjamin Harley (USA)
Synopsis: Lose yourself in the radically different world of wild dolphins as you follow graceful dancers into the deep ocean, surrounded by swirling, chattering, and squawking, at once familiar and unfamiliar.
Best of Show Animation category:
"Chlamydomonas reinhardtii 3D - From Biological Cells to Biofuels"
by Björn Sommer and Niklas Biere (Australia / Germany).
Synopsis: Chlamydomonas reinhardtii is a green alga which is often used in biotechnology as a model organism. This single-cell organism, ~10 microns in size, contains a very large chloroplast responsible for energy production, part of which powers its movement via two flagella. Recently, it has moved into the focus of biotechnological research, driven by the idea of producing biological fuels. This stereoscopic 3D animation illustrates and breaks down the complex intracellular relationships and processes involved.
John Stern joked that "Chlamydomonas reinhardtii 3D - From Biological Cells to Biofuels" was the Best of Show with the worst title.
The producers of the 2016 SD&A 3D Theater were: John Stern (Intuitive Surgical Inc., retired), Chris Ward (Lightspeed Design), and Andrew Woods (Curtin University). Management and playback of 3D content was expertly handled by Dan Lawrence (Lightspeed Design). The 3D content partner for the session was the LA 3-D Movie Festival (USA).
The evening concluded with the SD&A conference dinner at M. Y. China in the Westfield San Francisco Centre, near the conference hotel. The "M. Y." in the restaurant name refers to chef Martin Yan who is well known for his cooking TV show "Yan Can Cook." The food and company were delightful, and Martin Yan himself visited our group several times during the evening to keep us entertained - even including a live noodle making demonstration to the tune of "Gangnam Style".
The second day of the conference had three technical sessions on Human Factors and 2D to 3D Conversion, 3D Image Quality and Visual Comfort, and Autostereoscopic Displays. The day also included the second SD&A keynote and the demonstration session.
The second keynote presentation was presented by Greg Kintz on behalf of himself and Bob Furmanek from the 3-D Film Archive (USA). The presentation summarized the archive's activities in saving, restoring and releasing historical 3D movies. The audience were treated to some high-quality 3D footage from recent historical 3D restorations including Kiss Me Kate, Gog and New Dimensions (aka Motor Rhythm) - a stop motion animation of the construction of a Plymouth automobile.
The final event of the day was the ever-popular Demonstration Session, which has run every year since 1990. Since 2006, this has been a symposium-wide event, open to demonstrators from all of the Electronic Imaging conferences. It was pleasing to see a wide range of demonstrations and to see a large audience actively engaging with the various displays and vendors.
Demonstrations relevant to SD&A topics included:
- Björn Sommer from Monash Univ. (Australia) and CELLmicrocosmos (Germany) used a zSpace 3D monitor to illustrate the 3D cell visualisation software that he and colleagues have developed and shown in the CAVE2 room-size 3D visualisation system at Monash University.
- David Fattal from LEIA 3D (USA) demonstrated their full parallax, 64-view, diffractive 3D display. LEIA 3D's technology, the subject of much interest in the display community since a 2013 Nature piece about its roots at HP Labs, combines a custom waveguide and a backlit LCD display to produce multi-view 3D imagery in color. Similar in some ways to integral photography displays, the waveguide's array of diffractive patches directs the light from collections of pixels to various viewing zones.
- Shyam Kannapurakkaran and Barry Rothstein from the LA 3-D Club (USA) demonstrated "Sandbox" - a small-table size standalone 3D viewer for pop-up (phantogram) stereoscopic photographs, video and interactive media. The imaging modality and the device are designed to stimulate play, creativity and learning.
- Tim Macmillan and his team from GoPro (USA) demonstrated the GoPro 360° 3D camera array, which uses 16 GoPro cameras and captures content that can be uploaded to the new YouTube 360° 3D channel.
- Gordon Wetzstein and his colleagues from Stanford University and NVIDIA showed their light field stereoscope demonstration. The head-mounted prototype uses a two-layer display design to provide focus cues for the viewer.
- Margaret Dolinsky and Chauncey Frend from Indiana University showed a hand-crafted virtual environment called "Figuratively Speaking" which is an immersive, interactive virtual reality art environment based on a series of paintings featuring figures that appear predominately as faces. The hardware platform to showcase the virtual environment was an Oculus Rift DK2 plus fans and heaters to diversify the experience.
A good number of demonstrations from authors presenting at other Electronic Imaging conferences were also on display. A selection of photographs from the demonstration session will be available via the SD&A conference website www.stereoscopic.org as soon as we've caught our breath.
The second EI Plenary was presented by Ren Ng from the University of California, Berkeley (USA) (also founder, executive chairman and former CEO of Lytro) on "Pushing computational photography deeper into imaging system design." His presentation reviewed the development, promise and future of computational photography, which of course also has relevance to stereoscopic imaging. This presentation was a great continuation of the theme from the previous day's presentation by Tim Milliron from Lytro (USA) on "Capturing and Rendering Light-Field Video: Approaches and Challenges", where he discussed their new 360° 3D camera, the Lytro Immerge, which contains 60 to 200 cameras at 2K x 2K resolution producing 94 GB/s of data (5.7 TB per minute of footage).
The third day of the SD&A conference had the popular discussion forum and three technical sessions on 3D Content, Stereoscopic Image Processing and Depth Mapping, and Virtual Reality and 3D.
The discussion forum considered 3D in VR and AR: Application Challenges. The panel comprised moderator Carolina Cruz-Neira (Emerging Analytics Center, University of Arkansas at Little Rock), Devon Copley (Nokia) and Marty Banks (University of California Berkeley). The panel considered the stereography required for VR and AR and the additional factors that developers must consider to deliver a comfortable and compelling experience.
The third and final EI Plenary was presented by Achin Bhowmik of Intel Corporation (USA) on "Intel RealSense Technology: Adding human-like sensing and interactions to computing devices." The presentation discussed Intel RealSense Technology, which is enabling a new class of interactive and immersive applications based on embedded real-time 3D visual sensing - spanning PCs, mobile computing devices, intelligent autonomous machines, robotics, and the internet-of-things, and blurring the border between the real and the virtual worlds.
The day concluded with the interactive paper / poster session. There was lots of energy in the hall, with presenting authors standing by their posters, kept busy with questions from the attendees.
Video recordings were made of most technical sessions in the SD&A conference hall, including the two keynotes. Editing is underway and the content will be made available online via the SD&A conference website.
In addition to the prizes for the 3D Theater, a final prize was offered at the conference for the best use of stereoscopic presentation tools during the technical presentations. The winner was chosen by the SD&A conference chairs.
The winner of the prize for the best use of the stereoscopic projection tools during the SD&A conference presentations was
"An efficient approach to playback of stereoscopic videos using a wide field-of-view"
by Chris Larkee and John LaDisa (Marquette University, USA).
The prizes this year were copies of the Blu-ray 3D disc "3-D Rarities". Congratulations to all our prize-winners.
Many individuals and companies contributed in various ways to the success of this year's SD&A conference:
- We appreciate the support of this year's stereoscopic projection sponsors: DepthQ Stereoscopic (USA), Christie Digital (USA), and Tekamaki (USA). The ability to present high-quality large-screen stereoscopic images and video at the conference is vital to the success of the conference.
- This year we had a Christie Digital Mirage HD10K-M projector (1920 x 1080 resolution, 16:9 aspect ratio, 3 chip DLP, 10,000 ANSI lumens, provided by Christie Digital) projecting onto a 4.9 x 2.7 meter silvered screen (provided by STRONG / MDI Screen Systems), outputting frame-sequential circularly-polarized 3D (at 120Hz) by way of a DepthQ active polarization modulator (provided by Lightspeed Design). The system was driven by a DepthQ stereoscopic media server for playback of all of the stereoscopic video content shown during the 3D Theater.
- Many thanks to the individuals who worked on-site: Adrian Romero and staff from Spectrum Audio Visual; Chris Ward and Dan Lawrence from Lightspeed Design. The AV setup was coordinated by Diana Gonzalez from IS&T, Adrian Romero from Spectrum AV, and Andrew Woods from Curtin University.
- We very much appreciate the dedicated support of Stephan R. Keith (SRK Graphics Research), who again had a multi-tasked role at this conference, including supporting the needs of all of our presenters.
- Jessica Davis Brome provided additional support by tracking author video recording permissions.
- We are grateful to all of the providers of 3D content for allowing their content to be shown to the conference audience at the 3D Theater Session.
- Thanks to the demonstration session presenters for bringing equipment to show - especially to the presenters who brought equipment from overseas.
- The conference committee plays an important role throughout the year, ensuring the correct technical direction of the meeting. Sincere thanks go to our founding chair, John Merritt, and our committee: Neil Dodgson, Davide Gadia, Hideki Kakeya, Stephan Keith, Michael Klug, John Stern, Chris Ward, and Michael Weissman. This year we welcomed two new committee members to the team - Stephan Keith and Michael Klug.
- Thanks also to the staff at IS&T - the organizing society instrumental in organizing all manner of aspects for the meeting.
- Most importantly, we thank the conference authors and attendees, who ultimately made this meeting such a successful event. Thanks especially to those who travel a long way to join us each year.
We were very pleased to see four important members of the SD&A conference community rewarded for their continued long-term service to the conference at this year's event. Stephan R. Keith, Chris Ward, Dan Lawrence and John Stern each received an IS&T Service Award during the Tuesday EI Plenary session - in particular for Stephan's long term volunteer role supporting author AV at the conference, and Chris, Dan and John's hard work on the hugely popular SD&A 3D Theater session.
There are three technical aspects that the SD&A conference chairs observed during this meeting on which we would like to provide specific commentary, as they also relate to the wider stereoscopic imaging technical community.
When is vergence-accommodation mismatch a problem? The topic of vergence-accommodation mismatch was cited in many presentations and demonstrations at the conference. In some viewing situations there is a significant mismatch between the focal plane of the image and the perceived depth location of the image, creating a conflict for the viewer between the accommodative demand to focus the image and the vergence demand to bring the point of interest into single, fused vision. This particularly affects desktop and handheld displays (at short viewing distances), where the depth of field of the eye is shallow and the stereoscopic depth presented can easily exceed this range. There is evidence this affects the quality of the viewing experience when the conflict is significant. However, it is important to understand that the longer viewing distance of large-screen TV and cinema displays mitigates this problem: the display is effectively at, or close to, optical infinity, and there is no longer an accommodative demand on the eye for all practical values of stereoscopic depth. Recent advances in VR displays raise this issue in a new way, as they can practically show a wide range of depth from near vision to optical infinity. The question for VR display designers is where to place the focal distance of the image: optical infinity seems a good compromise for gaming, but it might make close interaction work difficult. Two avenues being explored to address this are adaptive optics and lightfield optics, which aim to present a more natural focal experience to viewers. However, the series of optical design trade-offs in VR displays depends very much on the application - does it have near-field and/or far-field depth requirements, and what image quality is required? - since there is currently a quality trade-off in adding adaptive or lightfield optics to VR displays.
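The effect of viewing distance on the mismatch can be illustrated with some simple diopter arithmetic (a back-of-envelope sketch of our own, not drawn from any conference paper; the example distances are illustrative assumptions):

```python
def diopters(distance_m):
    """Optical demand in diopters is the reciprocal of distance in meters."""
    return 1.0 / distance_m

def va_mismatch(screen_m, perceived_m):
    """Vergence-accommodation mismatch in diopters.

    Accommodation demand is set by the physical screen distance;
    vergence demand is set by the perceived (stereoscopic) depth.
    """
    return abs(diopters(screen_m) - diopters(perceived_m))

# Illustrative distances only (our assumptions):
# Desktop monitor at 0.5 m showing an object that appears at 0.3 m:
desktop = va_mismatch(0.5, 0.3)    # |2.00 - 3.33| ~ 1.33 D
# Cinema screen at 15 m showing an object that appears at 5 m:
cinema = va_mismatch(15.0, 5.0)    # |0.07 - 0.20| ~ 0.13 D
```

At desktop distances the mismatch exceeds a diopter, while the same relative depth placement at cinema distances stays near a tenth of a diopter - consistent with the observation above that longer viewing distances mitigate the conflict.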
This area continues to be one of active research and discussion and will likely be the topic of many future presentations at SD&A. Among the many themes yet to be fully investigated is the effect of ageing and the reduction in accommodative power this brings for older display users - like your esteemed conference chairs!
When is a display technology a hologram? Holograms are close to the heart of many SD&A conference attendees - indeed, former SD&A conference chair (the late) Stephen Benton was the developer of the rainbow hologram, now used on almost every major credit card offered by banks around the world. Steve also co-chaired the Practical Holography conference for many years. Traditionally, a display has been deemed holographic if it relies principally on diffraction to reconstruct the light field of an image. However, in recent years, the terms 'hologram' and 'holographic' have started to be used widely in the lay media to describe any technology which produces a 2D or 3D image that appears to float in space. Displays which employ techniques such as Pepper's ghost, augmented reality displays, and integral imaging displays have all been described variously as holographic. The term 'Holodeck' was introduced in the TV show "Star Trek: The Next Generation" when it debuted in 1987, to describe a full-size, room-based simulated reality facility, set in the year 2364, using an undescribed future technology. More recently, materials describing the Microsoft HoloLens augmented reality headset commonly use the terms hologram and holographic to describe the visual results of that display. These uses are all a delightful homage to the original hologram, first described by Dennis Gabor in a 1948 issue of Nature ("A new microscopic principle", Nature, 161, pp. 777-778), but they are not technically correct. Web pages are cropping up clarifying what is and what isn't a traditional hologram; however, language is an evolving thing - open to change to cope with developments in the world around us. Are we seeing a new linguistic paradigm developing to describe the new range of display technologies becoming available around us? Will members of the SD&A community contribute to these developments in technology and language? Undoubtedly, the answer is yes!
The "3D is Dead" cliché: It seems that almost every month that we see another article in the media proclaiming the death of 3D. Journalism and media coverage are a fickle thing - new fields are heavily promoted and lauded at one point and all too ready to be torn down soon after. We are seeing VR and AR (Virtual Reality and Augmented Reality) gaining considerable media attention at the moment in anticipation of this year's big VR and AR product releases: Facebook's Oculus Rift, Samsung's Gear VR, Sony's PlayStation VR, Microsoft's Hololens and teaser announcements from the mysterious Magic Leap. In February 2016, John Riccitiello, the CEO of Unity Technologies, developer of virtual environment software widely used in the VR field, spoke of the 'Gap of Expectation' and the danger that the expectations for VR technologies will be built up so big that when those goals are not immediately met, the technologies will be announced a failure ( https://youtu.be/ThpvQ9AwzrI?t=12m10s ). He cautiously predicted that development speed and uptake will be slower than many are forecasting but nevertheless the field is going to produce amazing change. Stereoscopic 3D technologies such as 3D movies, 3D TVs, Blu-ray 3D and 3D games have experienced a massive revolution in the past 10 years. RealD rolled out their first 85 digital 3D theaters for the Disney movie Chicken Little exactly ten years ago last November. In 2009, Avatar was the big watershed moment for 3D movies - it was the biggest movie of all time. In 2010, almost every major TV manufacturer showed off brand-new 3D HDTV models at the Consumer Electronics Show in Las Vegas. The first discs of the new Blu-ray 3D format were released that same year. In 2011, Nintendo released the 3DS handheld game console with an autostereoscopic display which has gone on to sell over 50 million units worldwide, and four different major product variants have been released. 
As of 2014, there were an estimated 65,000 digital 3D screens worldwide, comprising 51% of all digital cinema systems installed (Motion Picture Association of America (MPAA), "Theatrical Market Statistics 2014", March 2015). In 2015, seven of the Top 10 highest grossing movies in the USA were 3D movies ( http://www.boxofficemojo.com/yearly/chart/?yr=2015 ). At the 2016 Academy Awards, seven Oscars went to 3D movies. Some sectors of the 3D market have seen decline in recent years from their massive peaks in the 2010-2012 period, and the initial massive growth rate has slowed in some areas. The percentage of new TVs in the consumer market offering 3D support has reduced, and in 2016 some TV manufacturers have dropped 3D support altogether. But let's remember that stereoscopic 3D is a key component of VR and AR - long live 3D!
Conference activities do not stop at the end of the annual meeting. The SD&A conference website and LinkedIn group provide a focus for conference activities during the time between conferences. We will soon be actively seeking abstracts for the 2017 conference, with a deadline in mid-2016 - see the SD&A website for details and deadlines. You can join the SD&A LinkedIn group to receive conference announcements. The website has an extensive collection of photographs highlighting the activities of past conferences. In addition, the website hosts the stereoscopic virtual library, which contains several historically important books that have been digitized, in full, into PDF format, and are available for free download.
The SD&A conference runs an active LinkedIn group which is available at:
LinkedIn has recently been reducing its email notification options, so if you're not a regular user of LinkedIn and you would like to be kept up-to-date with SD&A conference activities via email, it would be better to sign up to our conference mailing list. Visit here to sign up:
A number of conference attendees were live-tweeting at the event - most using the hashtag #SDAconf. You can see some of the chatter, including many images, in a summary document available here <https://doi.org/10.2352/ISSN.2470-1173.2016.5.SDA-1>, or by visiting this link: https://twitter.com/search?f=tweets&vertical=default&q=%23SDAconf&src=typd
Some of the tweets caught the lighter side of the conference including what might be a new tradition for the conference - the Tim Tam Slam. Look it up if you haven't heard of it!
You can visit the conference website to gain an understanding of the past, present, and future of stereoscopic imaging. Please think now about submitting a paper or attending next year's conference. The Stereoscopic Displays and Applications conference website is at:
Next year, the 28th annual SD&A conference will be held during the period 29 January to 2 February 2017, at the Hyatt Regency San Francisco Airport hotel in Burlingame - within sight of SFO airport. The hotel provides very convenient access from the airport with a free regular shuttle. This is the same venue where the conference was held from 2011 to 2013, except that the hotel has recently embarked on a multimillion-dollar renovation. The open internal atrium of the hotel is a picturesque aspect of the venue.
The 2017 SD&A conference will continue a tradition of presenting and demonstrating the latest technologies relevant to stereoscopic displays and applications. Please consider attending, presenting, or demonstrating at the 2017 event. We hope to see you there!
Andrew J. Woods
Nicolas S. Holliman
Gregg E. Favalora