Devices that deeply involve the senses can create an altered mental state: immersive media and immersive 3-D environments. Heads-up displays, Kinect, video mapping, iPads, slates, tablets, handhelds, and smartphones all have great reality-mixing potential.
Google is working on a set of HUD (heads-up display) glasses. Now in the prototype phase, they will enable users to tap into Google's cloud services through augmented reality. Here 9to5Google explains…
Google is in the late prototype stages of wearable glasses that look like the thick-rimmed frames "normal people" wear, but that add a heads-up computer interface. There are a few buttons on the arms, but otherwise they could be mistaken for ordinary glasses. We are not sure of the display technology being employed, but it is likely a transparent LCD or AMOLED panel such as the one demonstrated below. We have also heard that this device is not an "Android peripheral," as the NYT stated; according to our source, it communicates directly with the cloud over IP, though the "Google Goggles" could also use a phone's Internet connection over Wi-Fi or low-power Bluetooth 4.0. The use case is augmented reality tied into Google's location services: a user walks around with information popping up into the display, Terminator-style, based on preferences, location, and Google's data. That means these things almost certainly connect to the Internet, have GPS, and run a version of Android.
Since then, we have learned much more regarding Google’s glasses…
Our tipster has now seen a prototype and said it looks something like Oakley Thumps (below). The glasses, we hear, have a front-facing camera used to gather information for augmented reality apps, and they can also take pictures. The spied prototype has a flash, perhaps for help at night, or maybe just a way to take better photos. The camera is extremely small and likely only a few megapixels.
The heads-up display (HUD) is only for one eye and sits off to the side. It is not transparent, nor does it have a dual 3D configuration, as previously speculated.
One really cool bit: the navigation system currently used is head-tilt-to-scroll-and-click. We are told it is very quick to learn, and once the user is adept at navigation, it becomes second nature and almost unnoticeable to onlookers.
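As a rough illustration of how tilt-to-navigate might work, here is a minimal sketch that maps head pitch and roll (say, from an IMU) to navigation commands, with a dead zone so natural head motion doesn't trigger anything. The function name, thresholds, and command set are all assumptions for illustration, not anything reported about Google's actual implementation.

```python
def tilt_to_command(pitch_deg, roll_deg, dead_zone=10.0):
    """Map head tilt to a navigation command.

    Small tilts within the dead zone are ignored, so ordinary head
    movement doesn't scroll or click. The dominant axis wins:
    pitch scrolls, roll selects or goes back. All values illustrative.
    """
    if abs(pitch_deg) <= dead_zone and abs(roll_deg) <= dead_zone:
        return "idle"
    if abs(pitch_deg) >= abs(roll_deg):
        return "scroll_down" if pitch_deg > 0 else "scroll_up"
    return "select" if roll_deg > 0 else "back"
```

A real implementation would also debounce commands and require the head to return to neutral between gestures, which is presumably part of what makes the reported scheme "quick to learn."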
(As an aside, I built a head mouse as a Masters Thesis project a few years back that used head tilts to navigate and control menus. I am ready to collect royalties!)
I/O on the glasses will also include voice input and output, and we are told the CPU/RAM/storage hardware is near the equivalent of a generation-old Android smartphone; as a guess, we would speculate something like a 1GHz ARM Cortex-A8, 256MB of RAM, and 8GB of storage. In any case, it will also function as a smartphone.
Perhaps most interesting is that Google is currently deciding on how it wants to release these glasses, even though the product is still a very long way from being finished. It is currently a secret with only a few geeky types knowing about it, and Google is apparently unsure if it will have mass-market appeal. Therefore, the company is considering making this a pilot program, somewhat like the Cr-48 Chromebooks last year.
Yes, Google might actually release this product as a beta pilot program to people outside of Google, and soon.
FYI, Motorola has something cool brewing in this area as well.
Another quick hack using the Kinect beta SDK and my new Windows Phone (which is great!). What you see is a simple game engine utilizing the pseudo-holographic effect from my other videos: a Kinect "sees" the position of the viewer, and the 3D engine adjusts the image accordingly to give the illusion of a real 3D object. The engine also supports anaglyph 3D (red/cyan glasses) for a better effect in real life. A simple WP7 app controls the application and the helicopter using the phone's accelerometers. (Source – if you like it, check out my other videos. Thanks for watching!)
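The pseudo-holographic effect described above is usually implemented as head-coupled perspective: the tracked viewer position drives an off-axis (asymmetric) projection frustum, so the scene appears anchored in space behind the screen. Here is a minimal sketch of the geometry; the function name and the convention (screen centred at the origin, eye coordinates in metres) are our own assumptions, not taken from the video's source code.

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Return (left, right, bottom, top) near-plane bounds for an
    asymmetric viewing frustum.

    `eye` is the viewer's position relative to the screen centre
    (x right, y up, z out toward the viewer, all in metres), e.g.
    as reported by Kinect head tracking. The screen's physical edges
    are projected onto the near plane through the eye point, which
    is what makes the frustum asymmetric when the viewer moves.
    """
    ex, ey, ez = eye
    scale = near / ez                       # similar-triangles projection
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top
```

The four values plug straight into an asymmetric-frustum projection call (glFrustum-style); re-running this every frame with the latest Kinect head position is the whole trick.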
Ah, lasers. Those wonderful, super intense beams of light that we’ve seen used in headlights, projectors, and naturally, death rays. Like us, researchers at the Niels Bohr Institute at the University of Copenhagen figure there’s nothing lasers can’t do, and have figured out a way to use them to cool a bit of semiconducting material. This bit of black magic works using a membrane made of gallium arsenide and is based upon principles of quantum physics and optomechanics (the interaction between light and mechanical motion).
It turns out that when a one-millimeter-square membrane of gallium arsenide is placed parallel to a mirror in a vacuum chamber and bombarded with a laser beam, an optical resonator forms between them that oscillates the membrane. As the distance between the gallium arsenide and the mirror changes, so do the membrane's oscillations. And at a certain frequency, the membrane is cooled to minus 269 degrees Celsius, despite the fact that the membrane itself is being heated by the laser. So lasers can both heat things up and cool them down simultaneously, and if that confuses you as much as it does us, feel free to dig into the science behind this paradoxical bit of research at the source below. In other news, left is right, up is down, and Eli Manning is a beloved folk hero to all Bostonians.
Pico projectors are an easy way to increase the screen real estate of your mobile phone, but what if you’d rather not carry one around in your pocket or bulk up your phone’s slim profile with a slip on solution? Well, a team of intrepid researchers may have come up with an elegant solution to your problem that can work with any smartphone and external display: virtual projection. The system works by using a central server that constantly takes screenshots of the external display and compares them with the images from the phone’s camera to track its location. It then replicates what’s on the handset’s screen, while allowing you to add multiple image windows and position and rotate them as you see fit. Additionally, multiple users can collaborate and virtually project pictures or videos onscreen at the same time. Intrigued? See it in action for yourself in the video after the break. Continue reading… Researchers turn your smartphone into a virtual projector
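The tracking step described above, matching the phone camera's view against server-side screenshots of the external display, can be boiled down to template matching: find where a small camera patch best aligns within the larger screen image. Below is a toy sum-of-absolute-differences version over grayscale pixel grids; a real system would use a proper correlation method (e.g. OpenCV's template matching) plus perspective correction, and every name here is illustrative.

```python
def locate_patch(screenshot, patch):
    """Find the (row, col) offset in `screenshot` where `patch` fits best.

    Both arguments are 2-D lists of grey levels. This does an exhaustive
    sum-of-absolute-differences (SAD) search: lower SAD = better match.
    It's the brute-force core of the idea; real trackers use normalized
    correlation and search pyramids for speed and lighting robustness.
    """
    H, W = len(screenshot), len(screenshot[0])
    h, w = len(patch), len(patch[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            sad = sum(abs(screenshot[r + i][c + j] - patch[i][j])
                      for i in range(h) for j in range(w))
            if sad < best:
                best, best_pos = sad, (r, c)
    return best_pos
```

Once the server knows where the phone is "looking," it can composite the handset's screen contents at that spot on the external display, which is what lets several users pin and rotate windows at once.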
Well that didn’t take long. Shortly after getting our grubby mitts on the AT&T variant of Samsung’s Galaxy Note at CES, the jumbo phone has made its way into the loving arms of Uncle Sam at the FCC. Naturally, it’s not advertised as such, but test documents reveal that a model SGH-i717 handset packing UMTS/HSPA+ (21Mbps) and GSM/EDGE world radios, plus Ma Bell-friendly bands 4 and 17 LTE has passed the FCC’s emissions tests with flying colors. So, now that it’s got the governmental stamp of approval, all that’s left is to find out when we can make with the S Pen action on AT&T’s newly minted high speed network. Don’t keep us waiting, guys.
We love just about anything involving lasers or robotics here at Engadget, so naturally, we're intrigued by Sriranjan Rasakatla's Way-Go flashlight that combines the two. It combines a laser pico projector, a GPS module, and an attitude and heading reference system (AHRS) to not only light your path but also tell you which way to go. It can be used strictly as a flashlight, but users can also input starting and destination points to have the Way-Go guide them. There's also a wander mode that displays info about your surroundings as you stroll around, though naturally, such information must be pre-programmed into the device. Because it displays stuff that needs reading, the projector's connected to servos that keep it locked on a projection point, so it stays readable no matter how much you move the Way-Go around. Rasakatla sees the device being useful in search and rescue, backcountry trekking, and campus tour guiding — odd, 'cause in our day, kids walking around campus at night were trying to find out where the party was at, not learn about the architecture of the academic buildings. Regardless, you can see the Way-Go in action after the break.
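The servo-stabilization idea is simple to sketch: the AHRS reports the flashlight body's heading, and the servo is commanded to the offset that keeps the projector aimed at a fixed world bearing, clamped to the servo's travel. This is our own minimal model of the technique, not Rasakatla's code; the angle wrapping and the travel limit are illustrative assumptions.

```python
def servo_angle(target_bearing_deg, device_yaw_deg, limit=80.0):
    """Servo offset (degrees) that keeps the projector aimed at a fixed
    world bearing as the flashlight body rotates.

    The difference is wrapped into (-180, 180] so the servo always takes
    the short way around, then clamped to the servo's travel limits.
    """
    delta = (target_bearing_deg - device_yaw_deg + 180.0) % 360.0 - 180.0
    return max(-limit, min(limit, delta))
```

In practice you'd run this every AHRS update (typically 50–100 Hz) and feed the result through the servo's position command, plus a second servo for pitch so the projected text stays put on the ground as well.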
There's a huge problem with working out that has yet to be solved: when, precisely, do our workout clothes become too worn to wear anymore? Apple knows we can't be wasting endless minutes looking for holes and tears in our shirts and pants, so it's just obtained a method patent to let you know when your gear is past its prime. The patent claims sensor-equipped garments that can track how you use them, report that info back to a central database, and alert you when the clothing has reached "its expected useful lifetime." (Read: it's time to buy some new, undoubtedly more expensive gym clothes.) This latest bit of IP doesn't just cover clothing, either; Cupertino's claiming the same method for running shoes, too. The footwear bit also provides real-time feedback that compares your current running style to an established profile to keep your workouts consistent — useful feature, that, though we can't imagine such iShoes would make the folks in Niketown too happy. We're not sure how Apple aims to make the needed wearables equipped with embedded electronics, but we can offer you plenty of typically broad patent legalese explaining the system that'll get you buying them at the source below.
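Stripped of the patent legalese, the claimed method is essentially a usage counter with a rated lifetime and an alert threshold. A toy version, purely our own reading of the claim (the class name, rated-lifetime figure, and API are invented for illustration):

```python
class WearTracker:
    """Toy model of the patent's idea: garment sensors log usage events
    against a rated lifetime, and the wearer is alerted once the item
    has reached its expected useful life."""

    def __init__(self, rated_uses):
        self.rated_uses = rated_uses  # e.g. workouts the garment is rated for
        self.uses = 0

    def log_workout(self, count=1):
        """Record usage, as reported by the garment's sensors."""
        self.uses += count

    def needs_replacing(self):
        """True once usage meets or exceeds the rated lifetime."""
        return self.uses >= self.rated_uses
```

The patent's version reports to a central database rather than tracking locally, and presumably weights usage by intensity rather than counting sessions, but the replace-at-threshold logic is the same.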
Remember those wicked holographic augmented reality glasses that DARPA was so hot to build? They're almost here. Hiding out at Vuzix's CES booth we found a functional prototype for its Smart Glasses industrial-class monocular display — a special lens attached to a proprietary display driver that produces a bright, 1.4mm holographic picture for one of your peepers. Vuzix told us the lenses were the fruit of a DARPA project, and could allow soldiers involved in air-to-surface operations to track jets, check their ordnance, and mark targets for destruction. The military / industrial monocle will go on sale in Q3 of 2012 for somewhere between $2,500 and $3,000.
Want to look a little more, well, normal while you're augmenting your reality? You're covered — or at least you will be in 2013. Not only will Vuzix's consumer-facing smart glasses offer the same holographic heads-up technology that'll power their military-bound brother, they'll cost you a bundle less, too: between $350 and $600. The unit we saw wasn't final, but we're told the final version will be able to accept connections over HDMI, and may even be capable of displaying stereoscopic 3D content (you know, in case the real world wasn't real enough). Hopefully, we'll be able to tell you how those fit next year. Ready to see how you'll be gussying up reality in the future? Hit the break for our hands-on video coverage.