
From the glitz and glamour of KES 2010 at KINTEX in Ilsan, Gyeonggi Province, Korea IT Times traveled from northwest to southeast across the Seoul metro area to COEX in Gangnam, Seoul. We made our way to the third and fourth floors of the exhibition complex and entered the inner sanctum of digital engineering and artist rendering, the ivory tower of "never before seen in this world": the 9th meeting of the International Symposium on Mixed and Augmented Reality (ISMAR), sponsored by the Institute of Electrical and Electronics Engineers (IEEE). In this zone the natural laws of physics don't apply, and the gods of engineering and art redefine our world.
Earlier this month the Korea IT Times had the pleasure of talking with Gerard Joung-hyun Kim, Professor at Korea University and Co-Chair of ISMAR 2010 (the full interview can be read at www.koreaittimes.com). At that time we learned about the concepts and technology of Mixed and Augmented Reality.
What we learned during the conversation with Professor Kim is that, as opposed to Virtual Reality (VR), Mixed Reality (MR), which includes both augmented reality and augmented virtuality, brings real and virtual environments together to create new environments and visual scenes in which physical and digital objects are commingled and can engage with one another in real time. One example is the heads-up display in a modern fighter jet.
Augmented Reality (AR) is a real-time view of a physical, real-world environment that is augmented by virtual, computer-generated images. A simple example is when soccer fields show the logos of their sponsors: giant brand images are displayed on the field but are only seen when the viewer is watching the event on TV. So, now that we've cleared that up, on with the show.
Can you imagine trying to create a 3D digital map of your neighborhood playground? Try to define the specific GPS location and elevation of every feature, from trees to jungle gym to restroom building to statues and fountains. Then start on the texture of the grass, sand, water, stone and concrete. What is the weather, and how brightly is the sun shining?
Once the engineer has digitally replicated the environment, often with the help of an artist, they have to set the bounds of the artificial elements that will interact with the real environment. Augmenting reality occasionally means that what goes into the real scene cannot be tied to the physics of the real environment. Imagine a game of golf where Michelle Wie's wayward drive, instead of bouncing off the trunk of a tree, hits the tree, rolls up the trunk, down a branch, drops onto the green and rolls into the cup for a hole in one. Creating their own versions of time, space, inertia and gravity is all part of the program. This isn't the way things work in the real world, but if you are designing a new game, then it may be the way you want them to work.
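To see how little it takes for a designer to overrule the laws of nature, consider a minimal sketch in Python. The example below is purely illustrative (none of the names come from a real AR engine): one update rule obeys ordinary gravity, while a second, designer-defined rule makes the ball climb the virtual tree trunk instead of falling.

```python
# Minimal sketch: in an AR game, a virtual object need not obey real physics.
# All function names and values here are illustrative, not from a real engine.

def step_real_physics(pos, vel, dt=0.1, g=-9.8):
    """Ordinary ballistic motion: gravity pulls the ball down."""
    x, y = pos
    vx, vy = vel
    return (x + vx * dt, y + vy * dt), (vx, vy + g * dt)

def step_designed_physics(pos, vel, dt=0.1, climb=2.0):
    """Designed rule: on reaching the virtual tree (x >= 5), the ball climbs."""
    x, y = pos
    vx, vy = vel
    if x >= 5.0:                      # "contact" with the virtual tree trunk
        return (x, y + climb * dt), (0.0, climb)
    return (x + vx * dt, y + vy * dt), (vx, vy)

# Drive the same ball under the designed rules: it rises instead of falling.
pos, vel = (4.9, 1.0), (2.0, 0.0)
for _ in range(20):
    pos, vel = step_designed_physics(pos, vel)
print(round(pos[1], 2))  # prints 4.8 -- the ball has climbed, not dropped
```

The point is not the arithmetic but the architecture: the game loop calls whichever update rule the designer chooses, so "gravity" becomes just one interchangeable function among many.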
These scenarios represent a range of variables that are difficult to imagine, but somehow this is the challenge that these experts eagerly embrace in order to augment the reality that we know and improve the quality of our lives. This is the next frontier in the technical development of mankind, and ISMAR 2010 was the scene where the art and science that will be required to take us to the next level were on display.
Bringing visions of augmented reality to everyday life

Keynote speaker Henry Fuchs, the Federico Gil Professor of Computer Science and Adjunct Professor of Biomedical Engineering at the University of North Carolina at Chapel Hill, who has been active in the development of computer graphics since the early 1970s, said "the future is encouraging, as decreasing cost of the necessary components lowers the barriers to entry and encourages ever-increasing participation in innovation, development and use." He continued by saying "Coupled with the rapidly advancing technologies in sensors, cameras, displays, robotics, and networks, this should enable us to accelerate bringing the visions of augmented reality to everyday life."
In his speech, he touched on some amazing technologies, including tools used to enhance 3D scene acquisition by laser scanning with multiple acquisition cameras, augmenting a physician's view of a patient with registered internal imagery, and augmenting a user's surroundings with projection onto multiple nearby surfaces. He also spoke of augmenting a user's remote presence using a human-sized avatar that mimics their appearance, pose and gestures. Modeling for the purpose of research is a common technique used in developing this technology. Professor Fuchs also talked about augmenting tabletop displays with multi-user autostereoscopic capabilities, which refers to displaying stereoscopic images without the use of special headgear or glasses. He said that "The common goal of all of these systems is to enrich users' immediate surroundings with computer generated or controlled enhancements, which can run the gamut from mere virtual imagery to full-fledged robotic androids."
Collaboration and R&D are the key
Boris Debackere, an artist, was also a keynote speaker at the symposium.

New technologies will fuel a change in the way we experience moving images into the future
Debackere pointed to the film industry as one case where this dynamic has been at work for more than a century, setting an example for the media that has followed. He said "Nowadays, cinema is everywhere, especially outside the confines of the movie theater: it exists in all manner of altered forms and has become moreover an essential aspect of contemporary art." He continued by saying that "Film fused the magical way of creating movement introduced by the optical illusion toys with the qualities of photography to capture and represent reality. This merger shifted the attention of watching movement depicted to an expressive way of creating illusions by framing the world, structuring time and linking one experience to the next with sensations of images and sounds in space and time." He told the audience that new technologies will fuel a change in the way we experience moving images into the future.
Presentations in-depth and in scope
Throughout the symposium, a number of papers were presented that described the efforts of the scientific and artistic communities to advance the enhancement of the real human experience through technology. Some had the purpose of improving the quality of our leisure time through game play, such as the research being done at the University of Illinois at Urbana-Champaign by the team of Brett R. Jones, Rajinder Sodhi, Roy H. Campbell, Guy Garnett and Brian P. Bailey. Their paper was entitled "Build Your World and Play In It: Interacting with Surface Particles on Complex Objects." The paper was introduced during a talk given by Jones and Sodhi on Friday afternoon to a full house of approximately 300 scientists and artists.

Their description and demonstration showed how a game could be played using virtual elements in a real environment. The team had users create a field of play out of real objects and superimposed virtual game pieces on the real field. The game pieces resembled real pieces, but they were not bound by real forces of nature, like gravity. The presentation showed the process by which the research team, using existing technology and limited financial resources, both created the game and, within a relatively short period of time, had users enjoying it. They developed their own methodology for mapping the surface and integrated it into the ways the game pieces operated in the real world.
The symposium also offered workshops for the attendees to familiarize themselves with emerging technologies. One such workshop was presented by the team of Young-Jun Kim and Sei Jang of Samsung Electronics, Korea. They introduced the attendees to a new mobile platform called 'bada', which means "ocean" in Korean.
Meet "bada" the ocean
As the smart phone, with camera, GPS, compass and Internet access as basic functions, makes a home for itself in our daily lives, mobile AR applications are becoming quite popular. Prior to reading this article and others like it, some smart phone users may not have been aware of what AR is, but they do want many of the features that AR technology convergence has made available.
Apps such as Sekai Camera and Layar have opened mobile users' eyes to new AR experiences. Sekai Camera allows people to place tags and photos in any location around them. Layar enables people to place digital information on top of the real world as seen through the mobile phone's camera. Today, mobile users are looking for more mobile AR applications beyond Sekai Camera and Layar.
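Conceptually, an AR browser of this kind decides where on the camera view to draw a label by comparing the phone's GPS position and compass heading with the tagged location. The following Python sketch shows the idea under simplifying assumptions (a flat-earth approximation for short distances; all coordinates, the field of view and function names are illustrative, not taken from any real AR app):

```python
import math

# Sketch of how a mobile AR browser might place a point of interest (POI)
# on screen using GPS and the compass. Flat-earth approximation; all
# coordinates and the field-of-view value are illustrative assumptions.

def bearing_to(lat1, lon1, lat2, lon2):
    """Approximate compass bearing (degrees) from the user to the POI."""
    d_lat = lat2 - lat1
    d_lon = (lon2 - lon1) * math.cos(math.radians(lat1))  # correct for latitude
    return math.degrees(math.atan2(d_lon, d_lat)) % 360

def screen_x(poi_bearing, heading, fov=60.0, width=480):
    """Horizontal pixel position of the POI, or None if it is off-screen."""
    offset = (poi_bearing - heading + 180) % 360 - 180    # signed angle [-180, 180)
    if abs(offset) > fov / 2:
        return None                                       # outside the camera view
    return round(width / 2 + (offset / (fov / 2)) * (width / 2))

# User stands near COEX (approximate coordinates), phone heading 30 degrees;
# a POI lies a short distance to the north-east.
user = (37.5126, 127.0588)
poi = (37.5136, 127.0598)
b = bearing_to(*user, *poi)
print(screen_x(b, heading=30.0))  # POI drawn right of screen center
```

Real applications add elevation, sensor fusion to steady the noisy compass, and proper geodesic math, but the core loop is the same: convert a world position into a viewing angle, then into pixels.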
The workshop introduced bada, Samsung's new mobile platform, and presented the key fundamentals of bada as well as new opportunities for mobile application developers. The presenters also discussed the current status and future direction of mobile AR on bada.

There were also many posters that depicted technologies used in various facets of AR development, as well as presentations by exhibitors showing the results of the practical application of those technologies.
ISMAR 2010 was attended not only by the technical and artistic elite, but also by the people who will both benefit from and lead the development of AR far into the future: the students. Those who qualified were admitted at a reduced rate and given a rare opportunity to join seasoned professionals as they took stock of their field and discussed the ways it will be advanced in the near and long term. The experience will no doubt leave a lasting impression on those who were fortunate enough to participate.
The theme of ISMAR 2010 was "Borderless," and the arts, media and humanities (AMH) program had an exhibition called "Borderless Reality," co-hosted by the Art Center Nabi, ISMAR 2010 and V2_ and held at the Art Center Nabi.
Great Exploration
The exhibition allowed visitors to explore a borderless world that develops from the point where physical reality and the virtual layer become entangled in augmented and mixed reality technologies. The exhibition comprised seven works: three entries titled 'Augmented Shadow', 'Frame Seductions' and 'Fecundation', selected from the art gallery of the 9th ISMAR 2010; three works, 'Mirror Scrutinizer', 'RE:' and 'Serendipitor', which are receiving support from or being developed by V2_; and, lastly, a classical piece selected by Nabi, 'Telematic Dreaming' by Paul Sermon, which adopted the augmented reality concept in the early '90s.
Banquet and awards ceremony
The ISMAR 2010 banquet and awards ceremony was held at the Ramada Seoul Hotel on Friday evening, where six awards were presented: the Best Paper Award, which honors the authors of a research paper of exceptional merit; the Best Student Paper Award, presented to the author of the best paper written solely or primarily by a student; and two runner-up awards, one for a technical paper and one for an arts / media / humanities paper, presented to the individual or group who wrote an excellent paper. There were also awards for demonstration and poster presenters. The prize for the Best Paper Award was a smart phone and US$100; the Best Student Paper winner and the runners-up each received a smart phone; and the demonstration and poster award winners each received US$100. At the time of this writing the winners had yet to be announced and therefore are not included in this story. When the information becomes available, the story will be updated.
The Co-Chairs of ISMAR 2010, Ko Hee-dong of the Korea Institute of Science and Technology and Gerard Joung-hyun Kim, Professor at the College of Information and Communications at Korea University, along with the rest of the organizing committee, had a tall order in putting together this important event, and they seem to have done a wonderful job of augmenting the reality of place, people and papers with the technology to convey a look at the present and a glimpse of the future.