Walking the streets in search of a place to eat will be no hassle once head-mounted displays (HMDs) become affordable and ubiquitous. Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed K-Glass, a wearable, hands-free HMD that lets users find restaurants while checking out their menus.
[Image] K-Glass, developed by KAIST. Copyright: KAIST
If the user of K-Glass walks up to a restaurant and looks at its name, today's menu and a 3D image of the food pop up. The glasses can even show the number of tables available inside the restaurant. K-Glass makes this possible with its built-in augmented reality (AR) processor.
Unlike virtual reality, which replaces the real world with a computer-simulated environment, AR incorporates computer-generated digital data into the user's view of reality. With computer-made sensory inputs such as sound, video, graphics, or GPS data, the user's physical world becomes live and interactive. Augmentation takes place in real time and in semantic context with the surrounding environment: as the user passes a restaurant, for example, a menu is overlaid on its signboard rather than irrelevant information such as an airplane flight schedule.
Most commonly, location-based or computer-vision services are used to generate AR effects. Location-based services use motion sensors to identify the user's surroundings, whereas computer vision relies on algorithms such as facial, pattern, and optical character recognition, or object and motion tracking, to distinguish images and objects. Many current HMDs deliver augmented reality by scanning markers or barcodes printed on objects: the AR system tracks the codes or markers to identify each object and then aligns virtual content with it. However, this approach fails for objects and spaces that carry no barcode, QR code, or marker, particularly in outdoor environments, and such objects cannot be recognized.
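The marker-based approach described above amounts to a lookup: decode a marker in the camera frame, then overlay the virtual content registered for it; anything without a registered marker is simply invisible to the system. The sketch below illustrates this limitation with a hypothetical marker registry; the IDs and content are illustrative assumptions, not part of any real K-Glass or HMD system.

```python
# Minimal sketch of marker-based AR: the system recognizes a printed
# marker (here, a hypothetical integer ID decoded from a QR code or
# fiducial) and returns the virtual content registered for that ID.
# Unregistered objects cannot be recognized, which is the limitation
# the KAIST processor is designed to overcome.

# Hypothetical registry mapping marker IDs to overlay content.
MARKER_CONTENT = {
    17: "Restaurant menu: today's specials",
    42: "Museum exhibit: audio guide",
}

def overlay_for_marker(detected_id):
    """Return the virtual content for a detected marker, or None if
    the object carries no marker and so cannot be recognized."""
    return MARKER_CONTENT.get(detected_id)

print(overlay_for_marker(17))  # a registered marker is recognized
print(overlay_for_marker(99))  # an unmarked object yields nothing (None)
```

An unmarked storefront in an outdoor scene corresponds to the second call: no entry exists, so the system has nothing to overlay.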
To solve this problem, Hoi-Jun Yoo, Professor of Electrical Engineering at KAIST, and his team developed, for the first time in the world, an AR chip that works like human vision. The processor is based on the Visual Attention Model (VAM), which duplicates the human brain's ability to process visual data. VAM, almost unconsciously and automatically, extracts the most salient and relevant information from the environment and discards the rest, so that unnecessary data is never processed. As a result, the processor can dramatically speed up the computation of complex AR algorithms.
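The attention-driven filtering that VAM performs can be illustrated, very loosely, with a contrast-based saliency sketch: pixels that stand out from the rest of the scene are kept for further processing, and the rest are discarded early. This is a conceptual illustration only, not the KAIST chip's actual algorithm, and the scene data is made up.

```python
# Loose illustration of attention-based data reduction: score each
# pixel by how much it differs from the scene's mean brightness, then
# process only the salient fraction of the image.
import statistics

def saliency(image):
    """Per-pixel contrast against the global mean of a 2D grid."""
    flat = [v for row in image for v in row]
    mean = statistics.fmean(flat)
    return [[abs(v - mean) for v in row] for row in image]

def salient_pixels(image, threshold):
    """Keep only the coordinates whose saliency exceeds the threshold."""
    sal = saliency(image)
    return [(r, c)
            for r, row in enumerate(sal)
            for c, s in enumerate(row)
            if s > threshold]

# A mostly uniform scene with one bright object: attention discards
# almost everything and keeps only the object's pixel.
scene = [[10, 10, 10, 10],
         [10, 10, 200, 10],
         [10, 10, 10, 10]]
print(salient_pixels(scene, threshold=50))  # → [(1, 2)]
```

Of the twelve pixels, only one survives the attention filter, which is the kind of early data reduction that lets the downstream AR computation run faster.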
The AR processor has a data-processing network similar to the human brain's central nervous system. When the brain perceives visual data, different sets of interconnected neurons work concurrently on each fragment of a decision-making process; one group's output is relayed to another group of neurons for the next round, and this continues until a set of decider neurons determines the character of the data. Likewise, the chip's artificial neural network allows parallel data processing, alleviating data congestion and significantly reducing power consumption.
KAIST’s AR processor, fabricated in a 65 nm (nanometer) process with an area of 32 mm², delivers 1.22 TOPS (tera-operations per second) peak performance at 250 MHz and consumes 778 milliwatts on a 1.2 V power supply. The ultra-low-power processor achieves an energy efficiency of 1.57 TOPS/W while processing 720p video at 30 fps in real time, a 76% improvement in power consumption over comparable devices. HMDs currently on the market, including Project Glass, whose battery lasts only two hours, have so far shown poor performance. Professor Yoo said, “Our processor can work for long hours without sacrificing K-Glass’s high performance, making it an ideal mobile gadget or wearable computer, which users can wear for almost the whole day.”
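The quoted figures are internally consistent: dividing the peak throughput by the power draw reproduces the stated efficiency.

```python
# Sanity check of the reported chip figures: peak throughput divided
# by power gives the energy efficiency the article quotes.
peak_tops = 1.22   # tera-operations per second
power_w = 0.778    # 778 milliwatts on a 1.2 V supply

efficiency = peak_tops / power_w
print(round(efficiency, 2))  # → 1.57 (TOPS/W)
```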
He further commented:
“HMDs will become the next mobile device, eventually taking over smartphones. Their markets have been growing fast, and it’s really a matter of time before mobile users will eventually embrace an optical see-through HMD as part of their daily use. Through augmented reality, we will have richer, deeper, and more powerful reality in all aspects of our life from education, business, and entertainment to art and culture.”
The KAIST team presented a research paper, entitled “1.22TOPS and 1.52mW/MHz Augmented Reality Multi-Core Processor with Neural Network NoC for HMD Applications,” at the International Solid-State Circuits Conference (ISSCC) held February 9-13, 2014, in San Francisco, CA.