Posts Tagged ‘augmented reality’

SeeThru By LASTER

January 24, 2014

The Laster SeeThru is lightweight (just under 2 oz.) wireless augmented reality (AR) eyewear. When you wear the SeeThru, information about your surroundings pops up without disrupting your normal field of vision. The information changes depending on what you are looking at. For instance, if you’re looking at a mountain chain, information about each peak can pop up alongside the landscape as you take it in. This kind of contextual information gives you a better awareness of your surroundings. There’s no looking up or down at tiny screens in the corner of your glasses with the SeeThru. Look the world straight in the eye, and the SeeThru will support you, seamlessly.

The Laster SeeThru is the first genuinely wireless pair of augmented reality glasses. Your smartphone acts as the SeeThru’s processor; the two devices connect wirelessly via Bluetooth.

The SeeThru offers unrivaled AR applications with fully patented optical see-through technology. Augmented reality contextual information is overlaid directly onto the object you’re looking at without any image distortion, thanks to the SeeThru’s transparent lens. Compare this with other AR devices, where contextual information is usually displayed on a separate intermediary screen over a video capture of the scene.

With up to 8 hours of battery life, the SeeThru is the best way to experience AR all day long. LASTER kept energy use low by letting your smartphone produce the AR content and using only Bluetooth to transmit it to the glasses. This architecture reduces not only the SeeThru’s energy use, but also its overall cost.

And to protect privacy, LASTER decided not to include a camera or recording capabilities in the SeeThru (no spy glasses here!). Instead, the SeeThru’s AR capabilities and tracking are supported by 10 built-in motion and GPS sensors.

To provide all of that AR contextual information, LASTER has embedded the most accurate sensors on the market (3 gyroscopes, 3 accelerometers, 3 compasses) and uses your smartphone’s processor to determine your location and what you are looking at.
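
LASTER has not published how these readings are fused, but combining a 3-axis gyroscope, accelerometer and compass into a stable head orientation is a well-understood problem. The Python sketch below is a generic complementary filter, offered purely as an illustration of the idea; the function name, the blend constant and the crude heading formula are assumptions of mine, not anything taken from LASTER.

import math

# Minimal sketch of how AR eyewear of this kind might fuse a 9-axis IMU
# (gyroscope, accelerometer, magnetometer) into a head orientation without
# a camera. This is NOT LASTER's algorithm, just a generic complementary
# filter for illustration.

ALPHA = 0.98  # trust the gyro short-term, the accel/compass long-term

def fuse_orientation(pitch, roll, yaw, gyro, accel, mag, dt):
    """Return updated (pitch, roll, yaw) in radians.

    gyro  -- angular rates (rad/s) around x, y, z
    accel -- measured gravity vector (m/s^2)
    mag   -- measured magnetic field vector (any units)
    dt    -- time step in seconds
    """
    # 1. Integrate the gyroscope: fast and smooth, but it drifts.
    pitch_g = pitch + gyro[0] * dt
    roll_g  = roll  + gyro[1] * dt
    yaw_g   = yaw   + gyro[2] * dt

    # 2. Absolute references: noisy, but they do not drift.
    pitch_a = math.atan2(accel[1], math.hypot(accel[0], accel[2]))
    roll_a  = math.atan2(-accel[0], accel[2])
    yaw_m   = math.atan2(-mag[1], mag[0])  # crude, tilt-uncompensated heading

    # 3. Blend the two so drift is corrected without adding jitter.
    pitch = ALPHA * pitch_g + (1 - ALPHA) * pitch_a
    roll  = ALPHA * roll_g  + (1 - ALPHA) * roll_a
    yaw   = ALPHA * yaw_g   + (1 - ALPHA) * yaw_m
    return pitch, roll, yaw

The overlay would then be drawn at the screen position implied by this orientation together with the GPS fix supplied by the phone.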

Get It On Kickstarter! http://kck.st/19tDbMV

iOptik – a glimpse into the future

January 5, 2014


Innovega’s wearable transparent heads-up display, enabled by iOptik contact lens technology, delivers mega-pixel content with a panoramic field of view. This high-performance and stylish eyewear is perfectly suited for the enjoyment of immersive personal media. The first part of the video is a CGI compilation provided by CONNECT San Diego, and the second part is actual footage captured through our system.


iOptiks are contact lenses that enhance your normal vision by allowing you to view virtual and augmented reality images without any bulky apparatus. Instead of relying on an oversized VR helmet, the system projects full-color digital images onto tiny displays that sit very near to your eye.

iOptik lenses enhance your normal vision from within the eye itself via the contact lens; the resulting effect allows for strikingly immersive, large-screen 3-D images.

Of course it isn’t just 3-D images that iOptiks can project. Innovega says that the applications for iOptiks go beyond simple movie viewing. While the micro-display can be occluded to allow for highly immersive 3-D images similar to what you would experience at the movies, it can also be used for 3-D gaming. You will even be able to utilize a “transparent display for augmented reality applications”.

Innovega demonstrated the iOptik lens in 2012. These contact lenses use nanotechnology that, when combined with a special set of glasses, allows the wearer to focus up close on a heads-up display projected onto the glasses while still seeing the world at a distance. They can also be used to deliver full-field 3D or a 360-degree gaming experience.

In this video, Randall Sprague, CTO of Innovega, explains how the device works and its potential applications.

Innovega Visualizing The Digital World.

2014 CES: Innovega Staff Wear Mega-pixel Panoramic Eyeglasses
Designers break media-bottleneck by using modern contact lenses

SEATTLE, WA., January 6, 2014 — Innovega Inc., developer of full field of view HUD eyeglasses, announced today that its staff will be wearing prototypes of its mega-pixel eyewear at its booth at 2014 CES. Steve Willey, Innovega CEO, explains, “at last year’s CES event we demonstrated new eyewear optics that offered to the wearer a clear and simultaneous view of both their personal digital media and of their immediate surroundings (http://youtu.be/-_sdoaemQ-k). The big news for 2014 is that our team has succeeded in advancing the platform from feasibility demonstration to wearable, contact lens-enabled, full-function, mega-pixel eyewear. Though 2013 represented an exciting launch of ‘wearable technology’ and ‘the Internet of things’, neither will gain traction without development of powerful user interfaces. Innovega staff will demonstrate our ability to fill this need by wearing the industry’s first rich-media eyeglasses at Booth # 70103 in the Venetian Hotel.”

The Innovega iOptik™ platform provides wearers a ‘virtual canvas’ on which any media can be viewed or application run. The prototypes will feature up to six times the number of pixels and forty-six times the screen size of mobile products that rely on designs limited by conventional optics. Our optics deliver games that are truly “immersive”, movies that mimic IMAX performance, a multi-tasking dashboard that incorporates five or more typical screens – all while simultaneously providing the wearer a safe and clear view of their environment.

About Innovega Inc.

Innovega provides second-generation components, core technology and reference designs that enable its OEM customers to develop new generations of high-performance digital eyewear. Its novel iOptik™ architecture improves comfort and styling by removing all of the bulky and heavy focusing optics from the eyewear. Its application of a modern soft contact lens yields an immediate panoramic field of view that enables immersive entertainment or multiple active windows, simultaneous with a continuous view of the wearer’s real world. Innovega’s use of conventional, transparent and stylish eyeglasses eliminates the social barrier that traditional wearable displays have created. Innovega maintains offices in Seattle, WA and San Diego, CA.

Source: Innovega Inc. Contact: Steve Willey (425) 516-8175

From the future with love

September 9, 2013

A movie Written and Directed by K-MICHEL PARANDI

Producers: James Lawler & K-Michel Parandi
Executive Producer: Virgil Price
Co-Executive Producer: Lauren Beck

Production Design & Concept Artist: Ben Mauro – Design art work: K-Michel Parandi & Ben Mauro – Costume Production Design: Julien Richard
Dialogue: K-Michel Parandi & Jack Coulton
Music and Sound Design: Pascal Bonifay (AOC/BOC)

First Assistant Director: Etan Harwayne-Gidansky
Second Assistant Director: Ramde Serolf
Assistant Producer: Paul Jarret
Unit Production Manager: Adam Benlifer

Production Coordinator: Dann Ramirez
Director’s Assistant: Louis Papaloizou

Director Of Photography: Ray Flynn

Cast: Max Kaminsky (NYPC): Justin Campbell – Parker (NYPC): Chris Beetem – Bodyjacker: Nathan Owen – Young NYPC cop: Tommy Walker – Hungry NYPC cop: Mike Falcon – Waitress: Kim Allen – Rami: Roberto Lopez
Thief: Louis Paploizou – MPF Narc: Toby Wilson

Camera Operators: K-Michel Parandi & Ray Flynn
First Assistant Camera: Violetta D’Agata
Second Assistant Camera: Christopher Bye
DIT: Drew Ravani & Stephen Dirkes
Aerial Photography: Marcin Nadolni & Toby Wilson

Steadicam: Amar Ioudarene
Editing Assistants: Max Smith & Tom Klane
Sound Design: Pascal Bonifay & Fabrice Smadja
Audio: AOC/BOC (M. Letaconnoux – S. Weinberg – L. Jokiel – B Mora – M. Singer)
Voice Talents: Kate Clark – Billy-Bob Thompson- Roberto Serrini – Kim Bonifay – Mia Bonifay

Storyboard: Andrew Wendel
Art Dept: Nick Tong – Brian Rzepka – Nola Denett – Nicole Eure
Wardrobe: Marina Lelchuk

VFX by Hectic Electric Amsterdam
VFX Producers: Mark Kubbinga & Patty Veestra
VFX Supervisor: Robbert Lubken
Post Production Services by Moon Dog Edit – New York, in Association with Violet Creative
Colorist: Blasé Theodore
Gaffer: Raina Oberlin

Best Boy Electric: Matt Kessler
Second Electric: Noah Chamis
Third Electric: Brendon Swift
Fourth Electric: Albert Phaneuf
Driver / Swing: Rebekka Bjornosdottir
G&E Intern: Deanna Covello – Jack Buckley
Key Grip: Stratton Bailey
Best Boy Grip: Will Gottlieb
Third Grip: Adam Barbay
Fourth Grip: Matt Garland
Rig Gaffer: David Duktus

Sound Department: Brian Flood – Oliver Rush

Stunt Coordinator: Roberto Lopez
Stunts: Luciano Acuna – Kenny Wong

Associate Producer: Ray Flynn

Production Assistants: Anthony Salvatori – Curtis Yarlborough – Victor Trejo – Christopher Duchene – Pierre Tissot – Grady Daub – Chelsea Moore – Angel Martinez – Ben Budde – Aldo Rodriguez – Chris Gautsh – Benjamin Budd – Angel Paredes

Drivers & Production Assistants: Patrick Chen – Mikhail Chernikov – Stewart Resmer – Alexander Bragg – Stephen Mitchell – Aido Rodriguez – Ryan Hawk

Still Photographer: Simon Briand

Special Thanks To: Channing Tatum – Reid Carolin – Sandy Morhouse – Rory Haines – Sohrab Noshirvani – Micah Sherman – Hoke Hokansen – Jill McDermid – Rafael Childress – Jon Darman – Brian Zingale – Remi Liebert

Director’s cut. April 2013.

AWE 2013

May 7, 2013

AWE 2013 will be held at the Santa Clara Convention Center on June 4-5, 2013.

Urban Augmented Reality

May 15, 2012

STREET ART & AUGMENTED REALITY BY GEC-ART & HUB09
Italian artist GEC-ART and HUB09 have created a new project combining street art and augmented reality. HUB09’s augmented reality app lets you frame the street art with your smartphone and watch it come to life in unexpected ways…. Interesting indeed!

(Source)

Digital Media SIG Event: Augmented Reality Gets Real | mitforumcambridge.org

February 22, 2012

If virtual reality creates a rich experience within a world that may not exist, then augmented reality (AR) creates a rich experience within the world that actually does. AR overlays relevant digital content on physical environments in real time so you can interact with them in ways that are more interesting and more powerful. Register now to hear how AR is creating cool new applications and exciting new business cases in areas ranging from consumer retail to travel to entertainment and more.

Read More: Digital Media SIG Event: Augmented Reality Gets Real | mitforumcambridge.org

Google X HUD

February 11, 2012

Google is working on a set of HUD (heads-up display) glasses. They are now in the prototype phase and will enable users to tap into Google’s cloud services through augmented reality. Here 9to5Google explains…

We detailed the first information about the Google [x] Glasses project in December.

They are in the late prototype stages of wearable glasses that look like the thick-rimmed glasses that “normal people” wear. However, these provide a display with a heads-up computer interface. There are a few buttons on the arms of the glasses, but otherwise they could be mistaken for normal glasses. We are not sure of the display technology being employed here, but it is likely a transparent LCD or AMOLED panel such as the one demonstrated below. In addition, we have heard that this device is not an “Android peripheral” as the NYT stated; according to our source, it communicates directly with the cloud over IP, although the “Google Goggles” could also use a phone’s Internet connection via Wi-Fi or low-power Bluetooth 4.0. The use case is augmented reality tied into Google’s location services: a user can walk around with information popping up into the display, Terminator-style, based on preferences, location and Google’s information. These things therefore likely connect to the Internet and have GPS, and they likely run a version of Android.

Since then, we have learned much more regarding Google’s glasses…
Our tipster has now seen a prototype and said it looks something like Oakley Thumps (below). These glasses, we heard, have a front-facing camera used to gather information that could aid augmented reality apps. The glasses will also take pictures. The spied prototype has a flash, perhaps for help at night, or maybe just as a way to take better photos. The camera is extremely small and likely only a few megapixels.

The heads up display (HUD) is only for one eye and on the side. It is not transparent nor does it have dual 3D configurations, as previously speculated.

One really cool bit: the navigation system currently used is head tilting to scroll and click. We are told it is very quick to learn, and once the user is adept at navigation it becomes second nature and almost unnoticeable to outside observers.
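
9to5Google does not explain how those tilt gestures are recognized. As a rough illustration only, the Python sketch below thresholds gyroscope rates so that a sustained sideways tilt scrolls and a quick nod clicks; the class, the callbacks and the threshold values are all hypothetical.

# Hypothetical head-tilt navigation of the kind described above. Google has
# not published how its prototype detects gestures; this simply thresholds
# gyroscope pitch/roll rates to produce scroll and "click" events.

SCROLL_RATE = 1.0   # rad/s of sustained sideways tilt that counts as scrolling
NOD_RATE    = 3.0   # rad/s pitch spike that counts as a click

class TiltNavigator:
    def __init__(self, on_scroll, on_click):
        self.on_scroll = on_scroll
        self.on_click = on_click

    def update(self, pitch_rate, roll_rate, dt):
        # A slow, held side-to-side tilt scrolls the menu.
        if abs(roll_rate) > SCROLL_RATE:
            self.on_scroll(roll_rate * dt)
        # A sharp nod selects the highlighted item.
        if pitch_rate > NOD_RATE:
            self.on_click()

# Feed it gyroscope samples at, say, 100 Hz:
nav = TiltNavigator(on_scroll=lambda d: print("scroll", round(d, 3)),
                    on_click=lambda: print("click"))
nav.update(pitch_rate=0.1, roll_rate=1.4, dt=0.01)   # -> scroll
nav.update(pitch_rate=3.5, roll_rate=0.0, dt=0.01)   # -> click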

(As an aside, I built a head mouse as a Masters Thesis project a few years back that used head tilts to navigate and control menus. I am ready to collect royalties!)
I/O on the glasses will also include voice input and output, and we are told the CPU/RAM/storage hardware is near the equivalent of a generation-old Android smartphone. As a guess, we would speculate something like a 1 GHz ARM Cortex-A8, 256 MB of RAM and 8 GB of storage. In any case, it will also function as a smartphone.

Perhaps most interesting is that Google is currently deciding on how it wants to release these glasses, even though the product is still a very long way from being finished. It is currently a secret with only a few geeky types knowing about it, and Google is apparently unsure if it will have mass-market appeal. Therefore, the company is considering making this a pilot program, somewhat like the Cr-48 Chromebooks last year.

Yes, Google might actually release this product as beta-pilot program to people outside of Google—and soon.

FYI Motorola’s got something cool in this area brewing as well.


(Source)

Windows Phone and Kinect to create HOLOGRAPHIC game engine

January 25, 2012


Another quick hack using the Kinect beta SDK and my new Windows Phone (which is great!). What you see is a simple game engine utilizing the pseudo-holographic effect from my other videos. A Kinect “sees” the position of the viewer and the 3D engine adjusts the image accordingly to give the illusion of a real 3D object. The 3D engine supports anaglyph 3D (red/cyan glasses) for a better effect in real life. A simple WP7 app controls the application and the helicopter using the accelerometers of the phone.  (Source — If you like it, check out my other videos. Thanks for watching! )
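
The core of the pseudo-holographic effect is head-coupled perspective: the viewer's tracked head position defines an off-axis viewing frustum through the fixed screen rectangle, so the rendered scene appears anchored behind (or in front of) the glass. The original hack runs on the Kinect SDK with an XNA/DirectX engine; the short NumPy sketch below only shows the projection math, and the screen size and eye position in it are made-up example values.

import numpy as np

# Off-axis (asymmetric-frustum) projection for head-coupled perspective.
# The eye position comes from a head tracker such as the Kinect; the screen
# is a fixed rectangle centred at the origin in the z = 0 plane.

def off_axis_projection(eye, screen_w, screen_h, near=0.1, far=100.0):
    """eye = (x, y, z) of the viewer in screen-centred metres, z > 0."""
    ex, ey, ez = eye
    # Project the screen edges onto the near plane as seen from the eye.
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez
    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# Each frame: read the head position from the tracker, rebuild the projection,
# and also translate the view matrix by -eye so the camera sits at the head.
P = off_axis_projection(eye=(0.12, -0.05, 0.65), screen_w=0.52, screen_h=0.32)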

Researchers turn your smartphone into a virtual projector

January 22, 2012

Pico projectors are an easy way to increase the screen real estate of your mobile phone, but what if you'd rather not carry one around in your pocket or bulk up your phone's slim profile with a slip-on solution? Well, a team of intrepid researchers may have come up with an elegant solution to your problem that can work with any smartphone and external display: virtual projection. The system works by using a central server that constantly takes screenshots of the external display and compares them with the images from the phone's camera to track its location. It then replicates what's on the handset's screen, while allowing you to add multiple image windows and position and rotate them as you see fit. Additionally, multiple users can collaborate and virtually project pictures or videos onscreen at the same time. Intrigued? See it in action for yourself in the video after the break. Continue reading... Researchers turn your smartphone into a virtual projector
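
The article does not spell out the matching algorithm, so the sketch below is only one plausible way to implement the tracking step with OpenCV: extract ORB features from the server's screenshot and from the phone's camera frame, then estimate a homography that tells the server where, and at what rotation, the phone is pointing. The function and variable names are mine, not the researchers'.

import cv2
import numpy as np

# One possible implementation of the screenshot-vs-camera tracking step:
# match ORB features between the two images and recover the homography
# that maps camera-frame coordinates onto display coordinates.

orb = cv2.ORB_create(1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def locate_phone_view(screenshot_gray, camera_gray):
    """Return the homography mapping camera pixels onto the external display."""
    kp_cam, des_cam = orb.detectAndCompute(camera_gray, None)
    kp_scr, des_scr = orb.detectAndCompute(screenshot_gray, None)
    if des_cam is None or des_scr is None:
        return None
    matches = sorted(matcher.match(des_cam, des_scr), key=lambda m: m.distance)[:80]
    if len(matches) < 8:
        return None
    src = np.float32([kp_cam[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_scr[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # the server then draws the phone's window wherever H points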

Researchers turn your smartphone into a virtual projector originally appeared on Engadget on Sun, 22 Jan 2012 12:37:00 EDT.

(Source: Dominikus Baur)

GM Advanced Tech Window

January 21, 2012


Got backseat boredom? DVD players and Game Boys are so five years ago, but a new concept in rear seat entertainment technology that uses the windows themselves could replace squirminess and snoozing with interactive scribbling, sweeping and pinching.

General Motors Research and Development put that challenge before researchers and students from the FUTURE LAB at Bezalel Academy of Art and Design in Israel. The task: Conceptualize new ways to help rear seat passengers, particularly children, have a richer experience on the road.

The Windows of Opportunity (WOO) Project was inspired by psychological studies indicating that car passengers often feel disconnected from their environment. GM asked the Bezalel students to turn car windows into interactive displays capable of stimulating awareness, nurturing curiosity and encouraging a stronger connection with the world outside the vehicle.

“Traditionally, the use of interactive displays in cars has been limited to the driver and front passenger, but we see an opportunity to provide a technology interface designed specifically for rear seat passengers,” said Tom Seder, GM R&D lab group manager for human-machine interface. “Advanced windows that are capable of responding to vehicle speed and location could augment real world views with interactive enhancements to provide entertainment and educational value.”

Since GM has no immediate plans to put interactive display windows into production vehicles, the R&D team gave the Bezalel students free rein to create applications without concern for whether they could be mass-produced. Bezalel is Israel’s oldest institute of higher education and one of the more prestigious schools of its kind in the world. (Source)

TI Forges Ahead In Augmented Reality On Its OMAP Platform

January 15, 2012

Reiterating its commitment to fuel best-in-class user experiences, Texas Instruments Incorporated (TI) today underscored strategic relationships with metaio and Total Immersion, aimed at bringing augmented reality (AR) capabilities to life on TI’s market-leading OMAP processors.

Both companies are providing AR software development kits (SDKs), optimized to work on TI’s OMAP processors, which now make it easier than ever to implement next-generation, immersive AR applications. This time, the collaborative efforts place the smart multicore OMAP processors at the heart of award-winning AR advancements and bring breakthrough AR design capabilities to a broader range of OEMs and developers. Exciting apps built using these SDKs are on display this week at the Consumer Electronics Show (CES) in TI’s meeting space (N116, North Hall).

“Our strategic partnerships with metaio and Total Immersion enable their AR SDKs to leverage the on-chip dedicated camera sub-system and hardware-accelerated computer vision libraries unique to the OMAP platform’s smart multicore architecture,” said Fred Cohen, director, OMAP user experience team, TI. “These innovators are at the forefront of their industry. In addition to differentiated technical capabilities, our work with metaio and Total Immersion introduces an unprecedented set of tools and access as well as professional support from each company, empowering developers to bring a new era of AR-based eCommerce applications to life.”

The OMAP-processor-optimized metaio Mobile SDK includes patented gravity-awareness visual tracking technology for 2D and 3D objects, which ensures more natural, intuitive and realistic AR experiences. Total Immersion’s D’Fusion AR platform leverages the OMAP platform’s processing speed for lightning-fast image recognition, rendering capability and extraordinary tracking abilities.

“We are thrilled to work with TI to make it easier and faster for developers to enable the most sought-after, futuristic AR capabilities imaginable. Our new gravity awareness feature and award-winning visual tracking technology for 2D and 3D objects pair with the OMAP processor to deliver natural, intuitive AR features that consumers demand.” – Dr. Thomas Alt, CEO, metaio.

“Collaborating with TI on our AR SDK ensures that D’Fusion® offers a best-in-class AR solution for mobile AR development. The OMAP processors have what our developers demand in terms of performance and optimization with TI’s smart multicore OMAP architecture. It makes existing AR applications better and faster, and will also enable new and exciting apps in markets thirsty for what AR capabilities have to offer.” – Bruno Uzzan, CEO and Co-Founder, Total Immersion

Availability
These SDKs are available today to developers and customers wanting to bring a higher quality, higher performing, lower power AR experience to mobile devices.

Visit metaio’s site to download the free SDK here: http://www.metaio.com/software/mobile-sdk/.

Visit Total Immersion’s site to download the free D’Fusion® SDK here: https://community.t-immersion.com and join Total Immersion’s developer community.

Source: TI Forges Ahead In Augmented Reality On Its OMAP Platform.

Vuzix augmented reality Smart Glasses prototype hands-on (video) | Tablet PC Comparison

January 13, 2012

Remember those wicked holographic augmented reality glasses that DARPA was so hot to build? They’re almost here. Hiding out at Vuzix’s CES booth we found a functional prototype of its Smart Glasses industrial-class monocular display — a special lens attached to a proprietary display driver that produces a bright, 1.4mm holographic picture for one of your peepers. Vuzix told us the lenses were the fruit of a DARPA project, and could allow soldiers involved in air-to-surface operations to track jets, check their ordnance and mark targets for destruction. The military/industrial monocle will go on sale in Q3 of 2012 for somewhere between $2,500 and $3,000.

Want to look a little more, well, normal while you’re augmenting your reality? You’re covered — or at least you will be in 2013. Not only will Vuzix’s consumer-facing smart glasses offer you the same holographic heads-up technology that’ll power their military-bound brother, they’ll cost you a bundle less, too: between $350 and $600. The unit we saw wasn’t final, but we were told the final version will accept connections over HDMI and may even be capable of displaying stereoscopic 3D content — you know, in case the real world wasn’t real enough. Hopefully, we’ll be able to tell you how those fit next year. Ready to see how you’ll be gussying up reality in the future? Hit the break for our hands-on video coverage.

Vuzix augmented reality Smart Glasses prototype hands-on (video) | Tablet PC Comparison.


Joseph Volpe contributed to this report.

Aurasma Visual Browser

January 13, 2012

By Bill Weir, C. Michael Kim & David Miller — This Could Be Big

Amidst the massive buffet of electronic goodies that stuffs the annual techno fest that is the Consumer Electronics Show, we found – or rather, were found by – a company with an innovative new application that could truly change the way we utilize our phones and tablets, and even the way we look at the world. The company is Aurasma and their augmented reality application will forever change the way you look at a twenty dollar bill.

Aurasma is a free Android, iPhone and iPad app that allows you to interact with the world in a whole new way. You simply hold your phone up to a still image that’s been augmented using Aurasma, and it transforms into anything from a video or animation to, in the case of a Harry Potter poster, a movie trailer. But it doesn’t simply connect to a link like a QR code; instead, it’s as if the movie poster or the $20 bill comes alive on the screen, like in the movie Jumanji.

Expect to hear more from Aurasma in the months to come. They’ve been one of the most talked-about innovations at this year’s show and are a finalist in CNET’s Best of CES awards.

Available on smartphones, the app was created out of technology that is capable of recognizing images, symbols and objects in the real world and understanding them. It can then deliver relevant content in real time, including videos, animations, audio or webpages.
Using the smartphone’s camera, GPS, compass, accelerometer and internet connection, the technology combines image recognition and a conceptual understanding of the 3D world to recognize objects and images and seamlessly merge augmented reality actions into the scene. Without the need for barcodes or tags, the app is able to see its surrounding environment and make it fully interactive.
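
Aurasma’s recognition engine and cloud back end are proprietary, but the basic “trigger image to content” lookup is easy to illustrate locally. The OpenCV sketch below keeps a tiny database of trigger descriptors and matches each camera frame against it; the file names, content entries and thresholds are hypothetical examples, not Aurasma’s data.

import cv2

# Toy illustration of trigger-image recognition: each known image ("aura")
# maps to a piece of content, and every camera frame is matched against the
# database. This is not Aurasma's actual pipeline.

orb = cv2.ORB_create(500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Hypothetical triggers: image file -> content that should "come alive".
TRIGGERS = {
    "harry_potter_poster.jpg": "harry_potter_trailer.mp4",
    "twenty_dollar_bill.jpg":  "bill_animation.mp4",
}

def build_database():
    db = {}
    for path, content in TRIGGERS.items():
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if img is None:
            continue
        _, des = orb.detectAndCompute(img, None)
        if des is not None:
            db[content] = des
    return db

def recognise(frame_gray, db, min_matches=25):
    """Return the content to play, or None if no trigger is in view."""
    _, des_frame = orb.detectAndCompute(frame_gray, None)
    if des_frame is None:
        return None
    best, best_count = None, 0
    for content, des_trigger in db.items():
        strong = [m for m in matcher.match(des_trigger, des_frame) if m.distance < 40]
        if len(strong) > best_count:
            best, best_count = content, len(strong)
    return best if best_count >= min_matches else None

A real system would run this kind of matching server-side against a far larger image database and add pose estimation so the overlaid content stays pinned to the poster as the phone moves.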

http://www.aurasma.com/