
Self-driving car technology still needs development

The accident, in which one of Google’s self-driving Lexus cars struck the side of a bus while steering around sandbags in its lane, was the first caused by a self-driving car after nearly 2.4 million kilometres of autonomous driving. It suggests that Google’s vehicles still need more common sense: a human driver would probably have just driven over the sandbags.

Researchers at the University of La Laguna in the Canary Islands are building a system which might have helped Google’s Lexus see the sandbags for the non-threat they were. They use Microsoft Kinect cameras, originally developed for the Xbox One gaming console, to improve self-driving cars’ obstacle avoidance at close range.

Self-driving cars typically use a combination of sensors to detect and avoid obstacles. Radar and laser-based lidar systems are generally used for objects at long range, while ultrasonic detectors and stereo cameras sense cars and pedestrians closer in.

Obstacles close to the ground, such as ramps, kerbs and sandbags, are difficult to make out, says Javier Hernandez-Aceituno, lead author of the study: “Laser-based sensors are not suitable for this task because they detect ramps as obstacles. Ultrasonic sensors are also unsuitable due to their low precision.”

Hernandez-Aceituno decided to try using a Kinect, a depth-sensing camera that uses an infrared laser to capture an instantaneous 3D map of objects up to about 4 metres away.
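
The article does not describe the detection pipeline itself, but the basic idea can be sketched in a few lines of Python. The following is a rough illustration only, assuming the Kinect depth frame is already available as a NumPy array of depths in metres, that the camera faces straight ahead over a level road, and that its mounting height and focal length are known; none of the numbers come from the La Laguna study.

import numpy as np

FOCAL_PX = 525.0            # assumed vertical focal length, in pixels
CAMERA_HEIGHT = 0.60        # assumed mounting height above the road, metres
MAX_RANGE = 4.0             # Kinect depth is only reliable to about 4 m
MIN_OBSTACLE_HEIGHT = 0.05  # anything taller than 5 cm is flagged

def low_obstacle_mask(depth):
    """Return a boolean mask of pixels that protrude above the road surface."""
    h = depth.shape[0]
    # Vertical pixel offset from the optical centre (positive = below centre).
    v = np.arange(h).reshape(-1, 1) - h / 2.0
    # A point at depth z seen v pixels below centre lies roughly v * z / f
    # metres below the optical axis (simple pinhole model).
    drop_below_axis = v * depth / FOCAL_PX
    height_above_road = CAMERA_HEIGHT - drop_below_axis
    valid = (depth > 0) & (depth <= MAX_RANGE)
    return valid & (height_above_road > MIN_OBSTACLE_HEIGHT)

Anything that sticks up more than a few centimetres inside the Kinect’s roughly 4-metre working range, such as a kerb or a sandbag, would show up in the returned mask.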

Last year, he and his colleagues installed a Kinect on an experimental golf cart called Verdino. A low-speed self-driving vehicle, Verdino is also equipped with laser rangefinders and stereo cameras from a PlayStation 4.

They set Verdino loose on an outdoor course with ramps, kerbs and stairs, using a single obstacle-detection program to process data from all of the sensors.

Misjudged ramp
The laser rangefinder ignored the lowest steps and incorrectly decided that the ramp was too steep to navigate. The stereo camera gave inaccurate results for very near and very far obstacles and suffered from false detections.

The Kinect, however, produced more accurate results and fewer false positives than the stereo camera. Its biggest problem was spurious obstacles created by reflections of sunlight, although Hernandez-Aceituno was able to filter these out.
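
The article does not say how that filtering was done, but one common approach, sketched below in Python, is to keep only detections that persist over several consecutive frames, since specular glints from sunlight tend to flicker while real obstacles stay put. The window size and threshold here are arbitrary.

from collections import deque
import numpy as np

class PersistenceFilter:
    """Keep only obstacles that appear in most of the recent depth frames."""

    def __init__(self, window=5, min_hits=4):
        self.min_hits = min_hits              # frames an obstacle must appear in
        self.history = deque(maxlen=window)   # most recent per-pixel detections

    def update(self, obstacle_mask):
        """Accumulate per-pixel detections and return only the stable ones."""
        self.history.append(obstacle_mask.astype(np.uint8))
        hits = np.sum(np.stack(list(self.history)), axis=0)
        return hits >= self.min_hits

Feeding each frame’s obstacle mask through such a filter suppresses detections that appear for only a frame or two.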

“The Kinect sensor vastly outperforms stereo vision at accurately detecting obstacles on the road,” says Hernandez-Aceituno, “[and] allows an autonomous vehicle to navigate safely in areas where laser rangefinders cannot detect obstacles.”

New Atlas – The latest robot technology

The humanoid Atlas robot, which has been overhauled with a sleeker design, can be seen at the beginning of the video walking around untethered before it opens the front door to Boston Dynamics’ office and steps outside. The bot is then seen walking on uneven and snowy terrain, maneuvering around trees and correcting its balance several times.

The new-and-improved robot is “designed to operate outdoors and inside buildings,” Boston Dynamics wrote in a description of the video posted on YouTube. “It is specialized for mobile manipulation. It is electrically powered and hydraulically actuated. It uses sensors in its body and legs to balance and LIDAR and stereo sensors in its head to avoid obstacles, assess the terrain, help with navigation and manipulate objects.”

Indeed, the video goes on to show Atlas bending down to pick up 10-pound (4.5-kilogram) boxes and pivoting its torso to place each package on a shelf. In another instance, a human handler uses a hockey stick to push Atlas off balance; the robot stumbles backwards but catches itself and regains its footing. Next, an employee pushes Atlas down from behind. The robot, left lying flat on its face, pushes itself back up, first to its “hands” and “knees,” then righting its torso and rising to its feet, all without help from a human or a tether.

Some commenters on the YouTube video expressed outrage at the man pushing the robot with a hockey stick: some said they felt sad for the robot, others called him a bully, and a few even suggested, perhaps with a grin, that he will be blamed for any future robot uprising.

“The guy who kicks the robot will be fully responsible [sic] from the forthcoming robot-human wars,” wrote Alper ALT.

Another commenter, jonelolguy, wrote: “Man, i actually feel bad for the robot.”

“Did anyone else feel pretty sad when they pushed it,” wrote Cris Loreto.

These commenters aren’t alone in attributing feelings to robots, particularly ones that look lifelike.

Researchers have found that when people watch a robot being harmed or snuggled, they react much as they do when the same things are done to a flesh-and-blood human. In one study, participants said they felt negative emotions when they watched a human hit or drop a small dinosaur robot, and their skin conductance showed they were distressed by the “bot abuse.” When volunteers watched a robot being hugged, their brain activity was similar to when they watched human-human affection, although activity was stronger for human-human abuse than for human-robot violence.

“We think that, in general, the robot stimuli elicit the same emotional processing as the human stimuli,” said Astrid Rosenthal-von der Pütten of the University of Duisburg-Essen in Germany, who led that study. The research was presented in 2013 at the International Communication Association Conference in London.

Last summer, Boston Dynamics upgraded the Atlas robot for the DARPA Robotics Challenge Finals, a competition hosted by the U.S. military’s Defense Advanced Research Projects Agency. The most significant changes at that time were to Atlas’ power supply and hydraulic pump, which help the robot stand, walk around and perform other tasks.

Boston Dynamics, which is owned by Google, said the new version of Atlas stands about 5 feet 9 inches (1.75 meters) tall and weighs 180 pounds (82 kilograms), making it about a head shorter than the version used in the DARPA Robotics Challenge Finals.

Source: http://www.livescience.com

Robotic software enters the banking system

A whole new range of companies, including Google and Apple, is starting to provide financial services aimed at the most profitable products and services currently offered by banks. As a result, banks are experiencing new and disruptive challenges to their traditional business models and product offerings.

Banks are responding by offering similar services in the way consumers now expect, but they face huge challenges in adapting their existing technologies, legacy systems and business architectures. This is where Banking Robotic Software is making rapid change possible.

The traditional response to these new competitors would be to replace legacy IT systems with state-of-the-art technology. Unfortunately, replacing these core systems would take years, cost enormous amounts of capital and carry significant operational risk. An option banks are increasingly taking instead is to augment existing systems with Banking Robotic Software. This approach allows new products and services to be offered through new customer interfaces, with Robots processing the various transactions through the user interface of the existing legacy applications. Doing this with human operators would be far less feasible; it becomes practical because Robots work at twice the speed of humans, 24 hours a day, 7 days a week. And because Robots can run on virtual computers, they need none of the physical infrastructure humans require and carry none of the attendance, coaching and other management overheads that come with human staff.
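
As a concrete, entirely hypothetical illustration of what “processing transactions through the user interface of existing legacy applications” can look like, the sketch below uses the open-source pyautogui library to fill in a payment screen the way a human operator would. The screenshot file, field layout and values are invented for the example.

import pyautogui

def enter_payment(account_number, amount):
    """Drive a (hypothetical) legacy banking screen exactly as a human would."""
    # Find the legacy app's payment button from a saved reference screenshot.
    try:
        button = pyautogui.locateCenterOnScreen("new_payment_button.png")
    except pyautogui.ImageNotFoundException:
        button = None
    if button is None:
        raise RuntimeError("Legacy payment screen is not visible")
    pyautogui.click(button.x, button.y)

    # Fill the form field by field, just as a human operator would type it.
    pyautogui.write(account_number, interval=0.05)
    pyautogui.press("tab")
    pyautogui.write(amount, interval=0.05)
    pyautogui.press("enter")

A scheduler could call enter_payment() for every transaction queued by the new customer-facing interface, around the clock.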

Banking Robotic Software can also be deployed much faster than large IT projects. Robots can be configured and trained on timelines similar to those for training a new member of staff, and once one Robot has been trained, all other Robots immediately receive the same training.
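
In practice, the “train one Robot, train them all” effect comes from the trained workflow living in shared configuration rather than in any individual worker. A minimal sketch, with an invented file name and step format:

import json

def load_workflow(path="payment_workflow.json"):
    # Every Robot worker reads the same shared workflow definition.
    with open(path) as f:
        return json.load(f)["steps"]

def run_robot(worker_id):
    # Each deployed Robot executes exactly the steps defined centrally.
    for step in load_workflow():
        print(f"Robot {worker_id}: {step['action']} on {step['screen']}")

Updating payment_workflow.json once changes the behaviour of every Robot that loads it, which is what training one Robot amounts to in practice.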

Banks are also applying Banking Robotic Software to their other business lines as a means of reducing cost and improving customer service. At a cost of less than $7.5K per year, Robots are enabling companies to repatriate work from business process outsourcers, resulting in further cost savings and better service.

Insurance Robotic Software

Insurance Robotic Software is helping reduce cost, mitigate risk and improve customer service. Insurance policies are evolving rapidly to accommodate new situations and technologies, while the need remains to support legacy policies. In the past, companies relied on humans to retain the knowledge and experience needed to support these legacy policies; today’s employment market, however, sees much higher staff turnover, with the attendant risk of losing the people experienced in administering legacy policies.

Robots can be trained in very complex business logic and are therefore able to process claims across a full range of policies quickly, efficiently and without human error. They also eliminate the risks associated with staff turnover.
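
To make that concrete, claim-handling logic of this kind is typically captured as explicit, versioned rules that every Robot applies identically, rather than living in any one employee’s memory. The sketch below is illustrative only; the policy types, threshold and decisions are invented.

from dataclasses import dataclass

@dataclass
class Claim:
    policy_type: str      # e.g. "legacy_life_1998" or "current_motor"
    amount: float
    policy_active: bool

def assess(claim):
    """Return a decision for the claim using fixed business rules."""
    if not claim.policy_active:
        return "reject: policy lapsed"
    if claim.policy_type.startswith("legacy") and claim.amount > 10_000:
        return "refer: legacy policy above automatic approval limit"
    return "approve"

print(assess(Claim("legacy_life_1998", 25_000, True)))
# -> refer: legacy policy above automatic approval limit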