Immersive media and immersive 3-D environments are devices and experiences that deeply engage the senses and can create an altered mental state. Heads-up displays, Kinect, video mapping, iPads, slates, tablets, handhelds, and smartphones all have great reality-mixing potential.
The Laster SeeThru is lightweight (just under 2 oz.) wireless augmented reality (AR) eyewear. When you wear the SeeThru, information about your surroundings pops up without disrupting your normal field of vision. The information you see changes depending on what you are seeing. For instance, if you’re looking at a mountain chain, information about each peak can pop up alongside the landscape as you take it in. This kind of contextual information gives you a better awareness of your surroundings. There’s no looking up or down at tiny screens in the corner of your glasses with the SeeThru. Look the world straight in the eye, and the SeeThru will support you, seamlessly.
The Laster SeeThru is the first genuine wireless augmented reality glasses device. Your smartphone acts as the SeeThru’s processor. The two devices connect wirelessly via Bluetooth.
The SeeThru offers unrivaled AR applications with a full patented optical see-through technology. Augmented reality contextual information is overlaid directly onto the object you’re looking at without any image distortion, thanks to the SeeThru’s transparent lens. Compare this with other AR devices, where contextual information is usually displayed on a separate intermediary screen after taking a separate video capture.
With up to 8 hours of battery life, the SeeThru is the best way to experience AR all day. LASTER kept energy use low by using only Bluetooth to communicate with your smartphone, which produces and transmits the AR content. This architecture reduces not only the SeeThru’s energy use but also its overall cost.
And to protect privacy, LASTER decided not to include a camera or recording capabilities in the SeeThru (no spy glasses here!). Instead, the SeeThru’s AR capabilities and tracking are supported by 10 built-in location and GPS sensors.
To provide all of that AR contextual information, LASTER has embedded the most accurate sensors on the market (3 gyroscopes, 3 accelerometers, 3 compasses) and uses your smartphone’s processor to determine your location and what you are seeing.
Innovega’s wearable transparent heads-up display, enabled by iOptik contact lens technology, delivers mega-pixel content with a panoramic field-of-view. This high-performance and stylish eyewear is perfectly suited for the enjoyment of immersive personal media. The first part of the video is a CGI compilation provided by CONNECT, San Diego and the second part is actual footage through our system.
iOptiks are contact lenses that enhance your normal vision by allowing you to view virtual and augmented reality images without any bulky apparatus. Instead of oversized VR helmets, full-color digital images are projected onto tiny displays that sit very near your eye.
iOptik lenses enhance your normal vision from within the eye itself via the contact lens; the resulting effect allows for very real, immersive, large-screen 3-D images.
Of course it isn’t just 3-D images that iOptiks can project. Innovega says that the applications for iOptiks go beyond simple movie viewing. While the micro-display can be occluded to allow for highly immersive 3-D images similar to what you would experience at the movies, it can also be used for 3-D gaming. You will even be able to utilize a “transparent display for augmented reality applications”.
Innovega demonstrated the iOptik lens in 2012. These contact lenses incorporate nanotechnology that, when combined with a special set of glasses, allows the wearer to focus up close to read a heads-up display projected on the glasses while still seeing into the distance. They can also be used to deliver full-field 3-D or a 360-degree gaming experience.
In this video, Randall Sprague, CTO of Innovega, explains how the device works and its potential applications.
Innovega Visualizing The Digital World.
2014 CES: Innovega Staff Wear Mega-pixel Panoramic Eyeglasses
Designers break media-bottleneck by using modern contact lenses
SEATTLE, WA., January 6, 2014 — Innovega Inc., developer of full field of view HUD eyeglasses, announced today that its staff will be wearing prototypes of its mega-pixel eyewear at its booth at 2014 CES. Steve Willey, Innovega CEO, explains, “at last year’s CES event we demonstrated new eyewear optics that offered to the wearer a clear and simultaneous view of both their personal digital media and of their immediate surroundings (http://youtu.be/-_sdoaemQ-k). The big news for 2014 is that our team has succeeded in advancing the platform from feasibility demonstration to wearable, contact lens-enabled, full-function, mega-pixel eyewear. Though 2013 represented an exciting launch of ‘wearable technology’ and ‘the Internet of things’, neither will gain traction without development of powerful user interfaces. Innovega staff will demonstrate our ability to fill this need by wearing the industry’s first rich-media eyeglasses at Booth # 70103 in the Venetian Hotel.”
The Innovega iOptik™ platform provides wearers a ‘virtual canvas’ on which any media can be viewed or application run. The prototypes will feature up to six times the number of pixels and forty-six times the screen size of mobile products that rely on designs limited by conventional optics. Our optics deliver games that are truly “immersive”, movies that mimic IMAX performance, a multi-tasking dashboard that incorporates five or more typical screens – all while simultaneously providing the wearer a safe and clear view of their environment.
Innovega provides second-generation components, core technology and reference designs that enable its OEM customers to develop new generations of high-performance, digital eyewear. Its novel iOptik™ architecture improves comfort and styling by removing all of the bulky and heavy focusing optics from the eyewear. Its application of a modern soft contact lens yields an immediate panoramic field of view that enables immersive entertainment or benefits from multiple, active windows, simultaneous with a continuous view of the wearer’s real world. Innovega’s use of conventional, transparent and stylish eyeglasses eliminates the social barrier that traditional wearable displays have created. Innovega maintains offices in Seattle, WA, and San Diego, CA.
Source: Innovega Inc. Contact: Steve Willey (425) 516-8175
Ever wished your computer could respond to your thoughts? Good news — it can. Get ready to leap into a new world with Tobii EyeX. Adding eye tracking to the action makes things fast, fun and totally intuitive. You control games like you’re in them. You zoom where you look. Text scrolls as you read. You are always in the right place.
Experience computer interaction with eye tracking by Tobii. This video shows some of the core interactions, as well as some experiences that are yet to be developed.
Tobii and SteelSeries team up to bring gamers the world’s first eye tracking gaming gear. Be first in creating the future of gaming with eye tracking.
Eye tracking increases the bandwidth between the gamer and the game, allowing gamers to do more at the same time and creating a richer gaming experience. Add an extra aiming mechanism, remove interruptions to gameplay by creating easier access to menus and commands, or make games with complex controls easier to learn.
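One common way an "extra aiming mechanism" works is by nudging the crosshair toward the gaze point only when the two are already close enough to plausibly be the same target. The Python sketch below illustrates that general idea; the names and thresholds are hypothetical, and this is not Tobii's or SteelSeries' actual algorithm:

```python
def gaze_assisted_aim(crosshair, gaze, snap_radius=150.0, strength=0.35):
    """Pull the crosshair part of the way toward the gaze point when
    the two are close (all values in screen pixels)."""
    dx, dy = gaze[0] - crosshair[0], gaze[1] - crosshair[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist > snap_radius:          # gaze is elsewhere: don't interfere
        return crosshair
    return (crosshair[0] + strength * dx, crosshair[1] + strength * dy)

print(gaze_assisted_aim((400, 300), (460, 300)))  # → (421.0, 300.0)
print(gaze_assisted_aim((400, 300), (900, 300)))  # too far → (400, 300)
```

The radius check is what keeps the assist from fighting the player's hands: gaze augments mouse or stick input rather than replacing it.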
Google is working on a set of HUD (heads-up display) glasses. They are now in the prototype phase and will enable users to tap into Google’s cloud services through augmented reality. Here, 9to5Google explains…
Google is in the late prototype stages of wearable glasses that look like the thick-rimmed frames “normal people” wear, but that provide a display with a heads-up computer interface. There are a few buttons on the arms of the glasses; otherwise, they could be mistaken for normal glasses. We are not sure of the display technology being employed, but it is likely a transparent LCD or AMOLED panel such as the one demonstrated below.

We have also heard that this device is not an “Android peripheral,” as the NYT stated. According to our source, it communicates directly with the cloud over IP, though the “Google Goggles” could also use a phone’s Internet connection over Wi-Fi or low-power Bluetooth 4.0. The use case is augmented reality tied into Google’s location services: a user can walk around with information popping up into the display, Terminator-style, based on preferences, location, and Google’s data. These things therefore likely connect to the Internet, have GPS, and run a version of Android.
Since then, we have learned much more regarding Google’s glasses…
Our tipster has now seen a prototype and says it looks something like Oakley Thumps (below). These glasses, we heard, have a front-facing camera used to gather information that could aid augmented reality apps; it will also take pictures. The spied prototype has a flash, perhaps for help at night, or maybe just a way to take better photos. The camera is extremely small and likely only a few megapixels.
The heads up display (HUD) is only for one eye and on the side. It is not transparent nor does it have dual 3D configurations, as previously speculated.
One really cool bit: the navigation system currently used is head tilting to scroll and click. We are told it is very quick to learn, and once the user is adept at navigation it becomes second nature, almost unnoticeable to outside observers.
(As an aside, I built a head mouse as a Masters Thesis project a few years back that used head tilts to navigate and control menus. I am ready to collect royalties!)
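A tilt-to-scroll scheme like the one described above generally needs a deadzone so that small, unintentional head movements don't scroll the view. Here is a minimal Python sketch of that mapping; the names, units, and constants are hypothetical and this is not Google's implementation:

```python
def tilt_to_scroll(tilt_deg, deadzone=5.0, gain=12.0):
    """Map head tilt (degrees) to scroll velocity (pixels/s).
    Tilts inside the deadzone are ignored; beyond it, velocity
    grows linearly so bigger tilts scroll faster."""
    if abs(tilt_deg) <= deadzone:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    return sign * (abs(tilt_deg) - deadzone) * gain

print(tilt_to_scroll(3.0))   # inside deadzone → 0.0
print(tilt_to_scroll(12.0))  # (12 - 5) * 12 → 84.0
```

Subtracting the deadzone before applying the gain keeps the response continuous at the threshold, which is part of why such gestures can feel second nature quickly.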
I/O on the glasses will also include voice input and output, and we are told the CPU/RAM/storage hardware is near the equivalent of a generation-old Android smartphone. As a guess, we would speculate something like a 1GHz ARM A8, 256MB of RAM, and 8GB of storage. In any case, it will also function as a smartphone.
Perhaps most interesting is that Google is currently deciding on how it wants to release these glasses, even though the product is still a very long way from being finished. It is currently a secret with only a few geeky types knowing about it, and Google is apparently unsure if it will have mass-market appeal. Therefore, the company is considering making this a pilot program, somewhat like the Cr-48 Chromebooks last year.
Yes, Google might actually release this product as beta-pilot program to people outside of Google—and soon.
FYI Motorola’s got something cool in this area brewing as well.
Another quick hack using the Kinect beta SDK and my new Windows Phone (which is great!). What you see is a simple game engine utilizing the pseudo-holographic effect from my other videos. A Kinect “sees” the position of the viewer and the 3D engine adjusts the image accordingly to give the illusion of a real 3D object. The 3D engine supports anaglyph 3D (red/cyan glasses) for a better effect in real life. A simple WP7 app controls the application and the helicopter using the accelerometers of the phone. (Source – If you like it, check out my other videos. Thanks for watching! )
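The pseudo-holographic effect described above is head-coupled perspective: the scene is re-projected along the line from the tracked viewer's eye to each 3-D point, so the image shifts as the viewer moves. The hack uses a full 3-D engine; the Python sketch below is only an illustrative stand-in showing the core ray-to-screen intersection, with hypothetical names and units:

```python
def head_coupled_project(point, viewer, screen_z=0.0):
    """Project a 3-D point onto the screen plane (z = screen_z) along
    the ray from the viewer's eye, producing viewer-dependent parallax.
    Assumes the point is not at the viewer's depth (pz != vz)."""
    px, py, pz = point
    vx, vy, vz = viewer
    t = (screen_z - vz) / (pz - vz)  # where the eye->point ray hits the screen
    return (vx + t * (px - vx), vy + t * (py - vy))

# A point 30 units behind the screen, viewer 60 units in front of it:
print(head_coupled_project((0, 0, 30), (0, 0, -60)))  # centered → (0.0, 0.0)
x, _ = head_coupled_project((0, 0, 30), (10, 0, -60))
print(round(x, 3))  # viewer moved right → image shifts left of them, 3.333
```

Because points behind the screen slide opposite to the viewer's motion, the brain reads the flat display as a volume, which is exactly the illusion the Kinect-tracked demo exploits.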
Ah, lasers. Those wonderful, super intense beams of light that we've seen used in headlights, projectors, and naturally, death rays. Like us, researchers at the Niels Bohr Institute at the University of Copenhagen figure there's nothing lasers can't do, and have figured out a way to use them to cool a bit of semiconducting material. This bit of black magic works using a membrane made of gallium arsenide and is based upon principles of quantum physics and optomechanics (the interaction between light and mechanical motion).
Turns out, when a one millimeter square membrane of gallium arsenide is placed parallel to a mirror in a vacuum chamber and bombarded with a laser beam, an optical resonator is created between them that oscillates the membrane. As the distance between the gallium arsenide and the mirror changes, so do the membrane's oscillations. And, at a certain frequency, the membrane is cooled to minus 269 degrees Celsius -- despite the fact that the membrane itself is being heated by the laser. So, lasers can both heat things up and cool them down simultaneously, and if that confuses you as much as it does us, feel free to dig into the science behind this paradoxical bit of research at the source below. In other news, left is right, up is down, and Eli Manning is a beloved folk hero to all Bostonians.
Pico projectors are an easy way to increase the screen real estate of your mobile phone, but what if you'd rather not carry one around in your pocket or bulk up your phone's slim profile with a slip on solution? Well, a team of intrepid researchers may have come up with an elegant solution to your problem that can work with any smartphone and external display: virtual projection. The system works by using a central server that constantly takes screenshots of the external display and compares them with the images from the phone's camera to track its location. It then replicates what's on the handset's screen, while allowing you to add multiple image windows and position and rotate them as you see fit. Additionally, multiple users can collaborate and virtually project pictures or videos onscreen at the same time. Intrigued? See it in action for yourself in the video after the break.
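Comparing the phone camera's view against the server's screenshot is essentially template matching: find where the small camera image best fits inside the big screen image. The brute-force sum-of-squared-differences sketch below shows the principle on tiny grayscale grids; the researchers' actual tracking pipeline is not described here, and all names are hypothetical:

```python
def best_match(screen, patch):
    """Find where `patch` (what the phone camera sees) best matches
    inside `screen` (the server's screenshot), using sum of squared
    differences as a brute-force stand-in for real feature tracking."""
    ph, pw = len(patch), len(patch[0])
    best, best_pos = float("inf"), (0, 0)
    for y in range(len(screen) - ph + 1):
        for x in range(len(screen[0]) - pw + 1):
            ssd = sum((screen[y + j][x + i] - patch[j][i]) ** 2
                      for j in range(ph) for i in range(pw))
            if ssd < best:
                best, best_pos = ssd, (x, y)
    return best_pos

# A 6x6 "screenshot" with a bright 2x2 block at column 3, row 2:
screen = [[0] * 6 for _ in range(6)]
screen[2][3] = screen[2][4] = screen[3][3] = screen[3][4] = 9
patch = [[9, 9], [9, 9]]
print(best_match(screen, patch))  # → (3, 2)
```

A production system would use pyramid search or feature descriptors instead of this O(W·H·w·h) scan, but the matching objective is the same.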
Well that didn't take long. Shortly after getting our grubby mitts on the AT&T variant of Samsung's Galaxy Note at CES, the jumbo phone has made its way into the loving arms of Uncle Sam at the FCC. Naturally, it's not advertised as such, but test documents reveal that a model SGH-i717 handset packing UMTS/HSPA+ (21Mbps) and GSM/EDGE world radios, plus Ma Bell-friendly bands 4 and 17 LTE has passed the FCC's emissions tests with flying colors. So, now that it's got the governmental stamp of approval, all that's left is to find out when we can make with the S Pen action on AT&T's newly minted high speed network. Don't keep us waiting, guys.
The recent announcement by a British medical ethics board in favor of an experimental three-parent IVF treatment--wherein genetic material from three donors, not the usual two, is used to create a fetus--has once again stirred the pot of reproductive controversy. So where exactly is the line...
Robotic Fish and Inflatable Tentacles: How MIT is Solving Hard Problems with Soft Robots
By Erik Sofge | Posted 03.14.2014 at 5:30 pm
This is a soft-bodied, inflatable robotic fish developed at MIT that uses carbon dioxide, as opposed to motors, to pull off agile underwater maneuvers. Melan...
Designed by German engineering firm Festo, these claw-tipped, artificially intelligent arms mimic the utility and movement of an elephant's trunk – but the resemblance to Doc Ock's writhing limbs is just uncanny.