Showing posts with label driverless car.

Tuesday, December 22, 2015

"If Two Cars Crash and No One Is Driving Them, Does It Make a Sound? Yes: Ka-Ching!"

Bloomberg: Imagine a robot car with no one behind the wheel hitting another driverless car. Who’s at fault?
The answer: No one knows. But plaintiffs' lawyers are salivating at the prospect of big paydays from such accidents. If computers routinely crash, they say, then so will cars operated by them. And with no one behind the wheel, lawyers say they can go after almost anyone even remotely involved.
“You’re going to get a whole host of new defendants,” said Kevin Dean, who is suing General Motors Co. over its faulty ignition switches and Takata Corp. over air-bag failures. “Computer programmers, computer companies, designers of algorithms, Google, mapping companies, even states. It’s going to be very fertile ground for lawyers.”

Monday, December 21, 2015

When Following The Law is Dangerous

"As it turns out, humans are kind of terrible at that. Which is a real problem for robot-cars."
One of the biggest obstacles currently facing researchers is the fact that driverless cars are engineered to always follow the law. So human drivers, who obviously don’t do the same, keep crashing into them when they’re “moving too slow” — AKA actually doing the speed limit.
As a result, according to a recent report from Bloomberg, driverless cars are now seeing a crash rate twice as high as cars with humans at the wheel. The report notes that they're all "minor scrape-ups for now," that they're always the human drivers' fault (usually humans rear-ending the slower-moving computer-driven cars), and that none of these accidents have caused any injuries.
But now researchers have to decide whether driverless cars should be taught to break the law in small ways — like humans so often do — to make sure they can safely do things like merge into high-speed highway traffic. That gets into some murky ethical territory.

Saturday, December 5, 2015

"Driverless cars could spell the end for domestic flights, says Audi strategist"

"In the future you will not need a business hotel or a domestic flight," Schuwirth told Dezeen. "We can disrupt the entire business of domestic flights."

Wednesday, September 2, 2015

"Google’s Driverless Cars Run Into Problem: Cars With Drivers"

"Last month, as one of Google’s self-driving cars approached a crosswalk, it did what it was supposed to do when it slowed to allow a pedestrian to cross, prompting its “safety driver” to apply the brakes. The pedestrian was fine, but not so much Google’s car, which was hit from behind by a human-driven sedan."
Google’s fleet of autonomous test cars is programmed to follow the letter of the law. But it can be tough to get around if you are a stickler for the rules. One Google car, in a test in 2009, couldn’t get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google’s robot.
It is not just a Google issue. Researchers in the fledgling field of autonomous vehicles say that one of the biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book. “The real problem is that the car is too safe,” said Donald Norman, director of the Design Lab at the University of California, San Diego, who studies autonomous vehicles.
“They have to learn to be aggressive in the right amount, and the right amount depends on the culture.”