YOU CAN’T SEE the bunny, but the picosecond laser certainly can. In a lab at Stanford, engineers have set up a weird contraption, hiding a toy bunny behind a T-shaped wall.
It’s not every week that a major industry player places its trust in a 22-year-old college dropout claiming to have come up with game-changing technology, but that’s exactly what happened this week.
The most exciting innovation in vision these days has nothing to do with our eyes, but rather with our cars. In California’s Silicon Valley, two veterans of Apple’s Special Projects Group, Soroush Salehian and Mina Rezk, have founded a startup called Aeva.
EARLY NEXT YEAR, a Boeing 777 will take off from the company’s airfield near Seattle with a laser shooting out of its nose.
YOU NEED JUST two eyes and two ears to drive. Those remarkable sensors provide all the info you need to, say, know that a fire engine is coming up fast behind you and get out of its way. Autonomous vehicles need a whole lot more than that.
THE BIGGEST CHALLENGE in building an autonomous vehicle is giving the car the ability to see the world. It requires a thorough understanding of lidar, the radar-like system of lasers that creates the digital map each car needs to navigate the world safely and competently.
EXPERIMENTAL self-driving cars continue to make regular forays onto the roads. After a trial in Pittsburgh, Uber launched several of its “autonomous” vehicles onto the streets of San Francisco on December 14th—and promptly ran into a row with officials for not obtaining an operating permit, which Uber insists is unnecessary as the vehicles have a backup driver to take over if something goes wrong.
Last week Velodyne raised $150 million in funding to develop better laser imaging technology (lidar) for driverless cars. Velodyne’s lidar works with cameras and radar to gather data about the vehicle’s surroundings and keep an eye on traffic and pedestrians.