How Google’s self-driving cars see the world – Tech Insider


Google’s self-driving cars are traveling more naturally than they ever have before. Decked out with GPS, sensors, cameras, radar, and lasers, the cars built by Alphabet (Google’s new parent company) can gather enormous amounts of data about their surroundings from a 360-degree perspective, allowing them to operate seamlessly in a constantly changing environment.

According to Google’s Self-Driving Car Project website, sensors on the car can detect objects up to two football fields away, including people, vehicles, construction zones, birds, cyclists, and more.

But the data collected by each vehicle does more than allow it to respond in the moment. Everything each car collects is used to continually improve the software, so every car can learn from one vehicle’s experience. Given that Google’s self-driving cars have driven more than 1.2 million miles in autonomous mode since 2009, the software knows how to react in a great many situations.

Chris Urmson, the head of technical development for the project, gave a thorough look at how the cars operate in real-life scenarios during a TED Talk in June.
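
The article doesn’t describe the software itself, but the basic idea of combining hits from several sensors into one 360-degree picture of nearby objects can be sketched in a few lines of Python. The Detection record, the fuse() helper, and the 183-metre range constant below are illustrative assumptions for this sketch, not anything from Google’s actual system, which uses far more sophisticated probabilistic tracking.

    from dataclasses import dataclass
    from math import hypot

    # Hypothetical detection record: each sensor (lidar, radar, camera) reports
    # an object as a position relative to the car plus a label such as "cyclist".
    @dataclass
    class Detection:
        sensor: str      # e.g. "lidar", "radar", "camera"
        x_m: float       # metres ahead of the car (negative = behind)
        y_m: float       # metres to the left (negative = right)
        label: str       # "pedestrian", "vehicle", "cyclist", ...

    MAX_RANGE_M = 183.0  # roughly two football fields, per the article

    def fuse(detections: list[Detection], merge_radius_m: float = 1.5) -> list[dict]:
        """Group detections from different sensors that refer to the same object.

        A real perception stack does this probabilistically over time; this
        greedy nearest-neighbour merge only illustrates the idea of turning
        many raw sensor hits into one 360-degree view of surrounding objects.
        """
        objects: list[dict] = []
        for d in detections:
            if hypot(d.x_m, d.y_m) > MAX_RANGE_M:
                continue  # beyond the sensing range described in the article
            for obj in objects:
                if hypot(obj["x_m"] - d.x_m, obj["y_m"] - d.y_m) <= merge_radius_m:
                    obj["sensors"].add(d.sensor)  # same object seen by another sensor
                    break
            else:
                objects.append({"x_m": d.x_m, "y_m": d.y_m,
                                "label": d.label, "sensors": {d.sensor}})
        return objects

    if __name__ == "__main__":
        hits = [
            Detection("lidar", 40.0, 2.0, "cyclist"),
            Detection("camera", 40.5, 2.2, "cyclist"),   # same cyclist, second sensor
            Detection("radar", 150.0, -3.0, "vehicle"),
        ]
        for obj in fuse(hits):
            print(obj)

Running the sketch merges the lidar and camera hits on the same cyclist into a single object seen by two sensors, while the distant radar return stays a separate vehicle.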

Source: How Google’s self-driving cars see the world – Tech Insider

