Posted on November 12, 2012 in Robotics by dulce303
URBI, the open source operating system for robots, is distributed under the BSD license
URBI, the first operating system dedicated to robotics, and urbiscript, its programming language, are now available under the BSD license to encourage wide adoption.
(October 10, 2012 – IROS, Vilamoura, Portugal) ALDEBARAN Robotics, the worldwide leader in humanoid robotics, announced at the IROS 2012 International Conference that it will distribute the URBI operating system and the urbiscript language under a BSD software license.
First developed in 2003 by Gostai, a software development company that recently joined the ALDEBARAN Robotics Group, the URBI and urbiscript open source package is intended for robotics developers and integrators and, by extension, for the wider world of computer programming.
The BSD license will allow the urbiscript language to spread more easily across developer communities and enable enhanced integration with other development tools under a BSD license.
With this approach, ALDEBARAN Robotics seeks to promote the use of URBI to facilitate parallel operation of multiple components on existing robotics platforms, and to allow users of a wide range of robots to exchange blocks of code.
Jean-Christophe Baillie, Chief Science and Technology Officer at ALDEBARAN Robotics and founder of Gostai, said: “Aldebaran has always believed in technologies built around URBI and integrates urbiscript into the list of programming languages compatible with NAO. Our longer term goal is for a new community to coalesce around URBI to promote exchanges and sharing, which will inevitably give a boost to the development of robot-dedicated applications.”
This news will strengthen our partnerships with academic communities and our private users, who will benefit from URBI-related support.
The URBI suite is compatible with numerous programming languages such as Java, C, C++ and Python, which can be used to program ALDEBARAN Robotics robots.
Distribution under a BSD license will be effective with the next NAO SDK release.
Founded in 2005 by Bruno Maisonnier, and with offices in France, China and the United States, ALDEBARAN Robotics designs, produces and commercializes autonomous humanoid robots with the aim of contributing to the well-being of humans. Today, over 2,500 NAOs are in use as research and educational platforms in 50 countries. ALDEBARAN Robotics brings together more than 210 people – including 40% engineers and PhDs – who are involved in the development and production of the robots.
Researchers at the Vienna University of Technology (TU Vienna) have developed a 3D printing technology that can quickly print detailed objects at the nanoscale using a process called two-photon lithography. It’s fast, too: the precision required to print objects with features measured in hundreds of nanometers in width meant the speed of previous attempts at printing nanoscale objects was measured in millimeters per second. In contrast, the TU Vienna team’s 3D printer is capable of printing lines of resin at a rate of five meters per second. In a demonstration shown in the video below, the team was able to print a nanoscale model of a 300-micrometer long Formula 1 racecar—made from 100 layers of resin, each consisting of approximately 200 individual lines—in four minutes.
A 330x130x100 micrometer race car, printed in four minutes (Vienna University of Technology video).
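A quick back-of-envelope check puts these numbers in perspective. The sketch below is ours, not the researchers’: it assumes, as a deliberate upper bound, that every one of the roughly 20,000 printed lines spans the car’s full 330-micrometer length.

```python
# Back-of-envelope check of the stated print-speed figures.
# Assumption (ours, an upper bound): each printed line spans the
# car's full 330-micrometer length.
layers = 100                  # layers of resin, per the article
lines_per_layer = 200         # approximate lines per layer
line_length_m = 330e-6        # 330 micrometers, per the caption
write_speed_m_per_s = 5.0     # five meters per second, per the article

total_path_m = layers * lines_per_layer * line_length_m
pure_writing_s = total_path_m / write_speed_m_per_s

print(f"total beam path: {total_path_m:.2f} m")        # ~6.6 m
print(f"pure writing time: {pure_writing_s:.2f} s")    # ~1.3 s
```

Even with this generous per-line length, pure beam travel accounts for only a second or two, so most of the four-minute print presumably goes to layer changes and repositioning—our inference, not a figure from the article.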
The new process, developed as part of the European Commission’s PhoCam program for developing “factories of the future,” could make it practical and affordable to print intricate nanoscale structures for use in microscopic machinery and medical applications. One of those applications is “scaffolds” that promote the growth of custom-made living tissue from cells by giving the cells a structure to stick to. “The technique already showed good applicability for fabricating 3D environments for cells,” TU Vienna researcher Jan Torgersen told Ars in an e-mail exchange about the research.
Torgersen added that since the two-photon process isn’t limited to printing in layers, but can draw lines in three dimensions, it can be used to embed and connect objects as well. For example, he said, the team has already successfully fabricated nanoscale optical waveguides into an existing electrical matrix. “These waveguides are very promising for various optoelectronic applications,” he said.
New (sort of) from Neato Robotics is the XV-21, an autonomous robot vacuum that’s been enhanced to provide better cleaning for people with an excess of pets. Besides costing a solid chunk more than the XV-11, will the XV-21 and its new pet-optimized hardware really do a better job, and is it worth the premium you’ll pay? Being the robot vacuum experts that we are (or that we claim to be, anyway), we’ll tell you what we think.
The Office of Naval Research has announced that they're developing SAFFiR, a humanoid firefighting robot designed to operate aboard ships that looks not entirely unlike the robot in the picture above. And as you've probably already guessed from the rAnDoMLy weIRD caPITaLIZAtiON, it's going to be developed in partnership with Virginia Tech's RoMeLa, already famous (at least, around here) for their CHARLI humanoid.
HDT Global has just introduced some new robotic limbs to give explosive ordnance disposal (EOD) robots like PackBots and Talons [pictured above] a helping hand (or two) when it comes to complex and delicate tasks like defusing bombs. This is a very good idea, since just poking high explosives with a simple gripper doesn't always work out the way everyone would like.
We're going 100 percent iRobot with today's post, 'cause they have a bunch of robotics news that all seemed to happen within the last week or so, and we want to make sure you're up to date: in addition to reorganization of the entire company, their remote presence robot Ava is learning some new tricks, and the U.S. Army has put in a big order for FirstLook reconnaissance robots. We've got details!
It's easy to argue that R2-D2 from Star Wars has more personality than many robots twice its size, but without a face or limbs to speak of, where does it all come from? The answer, of course, is sound. Now U.K. researchers are trying to do the same with real robots, teaching them to communicate information and emotions to humans using beeps, boops, and squeaks.
See this? It's a robot with bird legs. Genius. This, plus a manipulation arm from DARPA and a Darwin-OP humanoid learning to play Dance Dance Revolution, in this edition of our more or less reliable (it's less) video-filled Friday post.