A white car with Augmented Reality projection at UniSA's Mawson Institute's laboratory.
Imagine walking into a car showroom and finding the car of your dreams, only to have trouble choosing the right colour or design. No worries! With the flip of a switch, Augmented Reality technology makes the car appear to change colour, letting you preview each option on a life-size prototype before making any costly decisions.
The Wearable Computer Lab at the University of South Australia is doing just this. The science happens in a $3,000,000 laboratory dubbed the UniSA Holodeck, after Star Trek’s famous holographic room; it is a joint research facility between the Advanced Computing Research Centre and the Future Industries Institute. We take everyday objects and digitally drape them with projector-based Augmented Reality to enhance their appearance. The Wearable Computer Lab, together with Jumbo Vision International, has also developed CADwalk™, a full-scale environment that combines 2D, 3D and 3D stereoscopic visualisation, multiple projectors and 3D motion-capture cameras. It overcomes the customary challenges of Command and Control Centre room design by revolutionising a practice that was paper based, making it life-sized and interactive.
What is Augmented Reality?
Objects without and then with Augmented Reality projection. Source: Bruce Thomas
Augmented Reality (AR) is the registration of computer-generated graphical information over a user’s view of the physical world. Supplementing that view with information spatially located relative to the user can enhance their understanding of the world around them. AR combines virtual and real environments, although the exact mix can vary significantly. Milgram and Kishino used these properties to define a reality-virtuality continuum, which can be used to compare various forms of Mixed Reality (MR) by placing them on a spectrum, as explained in the diagram below.
At one end of the continuum is the physical world; at the other end are fully synthetic virtual environments (VR); and AR sits somewhere in between, since it is a combination of the two. On the far left of the continuum is the physical world with no virtual information at all. Moving to the right is AR, where artificial objects are added to the physical world. Further right, but not all the way to the end, is Augmented Virtuality, where physical-world objects, such as a live display of a remote view, are added into a fully immersive virtual environment. On the far right is a completely synthetic environment, with no information from the physical world being presented. Every type of 3D environment can be placed somewhere along this spectrum, making it easy to compare and contrast their properties. Mixed Reality is defined as the span covering both Augmented Reality and Augmented Virtuality.
Ron Azuma defines AR systems as those that contain the following three characteristics:
1. combines real and virtual,
2. interactive in real-time, and
3. registered in 3D.
This definition covers AR for projectors (as seen in the first picture above), head mounted displays, and smart phones, but excludes non-interactive media such as movies and television shows.
Consumer-Level Augmented Reality Technology
The Epson Moverio BT-200 Smart Glasses. Source: Gizmodo
The Epson Moverio BT-200 is the next generation of AR head-mounted displays, and they look pretty groovy. Unlike fixed projectors, head-mounted displays allow people to walk anywhere and view AR information.
To overlay 3D virtual models onto the user’s view, an AR system combines a display of computer-generated information with a sensor that measures the position and orientation of the user’s viewpoint. As the user moves through the physical world, the computer updates the display in real-time. The accuracy with which virtual objects are registered onto the physical world determines how convincing the fusion appears to the user.
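The registration step can be illustrated with the standard pinhole-camera projection: given the tracked pose of the user’s viewpoint, each virtual 3D point is mapped to a 2D display coordinate. The following is a minimal sketch in Python, using illustrative matrix values rather than parameters from any particular AR system:

```python
import numpy as np

def project_point(point_world, R, t, K):
    """Project a virtual 3D point (world frame) into pixel coordinates.

    R, t: tracked camera pose (world-to-camera rotation and translation).
    K:    3x3 camera intrinsics (focal lengths and principal point).
    """
    p_cam = R @ point_world + t      # transform into the camera frame
    if p_cam[2] <= 0:
        return None                  # behind the viewer, not visible
    p_img = K @ p_cam                # pinhole projection
    return p_img[:2] / p_img[2]      # perspective divide -> (u, v) pixels

# Example: camera at the origin looking down +Z, a virtual marker 2 m ahead.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
uv = project_point(np.array([0.0, 0.0, 2.0]), R, t, K)
print(uv)  # prints [320. 240.] — a point on the optical axis lands at the principal point
```

In a real AR system the pose (R, t) is re-estimated by the tracker every frame, and any error in it shifts the projected pixel, which is why registration accuracy directly affects how convincing the overlay looks.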
Smartphones are a great platform for AR because they combine all the required technology: high-quality graphics, processors and sensors that together allow a compelling AR experience. Better still, there are so many of them; AR can be experienced by anyone with such a device, and many applications are available to the public at no charge.
Companies like Layar are also producing navigation tools for people to find all sorts of useful information right around them, such as finding the closest ATM or interesting events and attractions nearby.
Layar Augmented Reality Navigation Platform. Source: Layar
UniSA’s Augmented Reality Research
The Wearable Computer Lab is investigating human computer interaction research for all forms of visual augmented reality.
One area of investigation is the support of remote workers using AR. An expert away from the work site is able to “paint” AR instructions onto an object to instruct a worker on how to cope with a problem. These collaboration tools are of particular interest to our mining companies.
Another prominent direction of our current research is AR tools to support the design of new products. Constructing high-quality visual mock-ups can be very expensive for companies. Our projector-based AR technology lets clients and designers cost-effectively build early mock-ups from simple physical objects detailed with projected AR graphics.
One such system is CADwalk™, which our team at the Wearable Computer Lab played an important role in developing with Jumbo Vision International. This ground-breaking workspace design system allows users to design and view potential Command and Control Centre workspaces in virtual reality.
The system, which was partially funded through the University’s commercialisation arm UniSA Ventures, recently won a South Australian iAward, part of a program that recognises ICT innovations across Australia.
Delivering capabilities once limited to science fiction movies, CADwalk™ enables clients to plot walls or furniture from life-sized building design plans using high-tech 2D and 3D visualisations at 1:1 scale. Users such as architects, engineers, builders and operators can move around life-size versions of concepts and modify room layouts in real-time using a special camera and movable pole receptor system.
CADwalk™ has already been utilised in a live setting by Transpower, New Zealand’s national electricity grid operator. However, the system has the potential to be used in a wide range of other industries, such as home design, hospitals, shipbuilding and construction.
We are delighted to be part of a collaborative team working on cutting-edge technology that has the potential to revolutionise the way design projects happen. It is great to see the fruits of our research making it into the real world.
CADwalk™ recently won a South Australian iAward. Source: Jumbo Vision International