
Tesla asks Full Self-Driving beta testers for permission to collect footage in the event of an accident


With the latest FSD (Full Self-Driving) beta, Tesla is asking drivers to agree to let the company collect footage from the vehicle’s exterior and cabin cameras in the event of an accident or “serious safety risk.” Electrek reports that this is the first time the company has asked for footage that can be tied to specific vehicles and drivers.

Tesla has previously collected footage as part of the FSD program, but only to train and improve its autonomous driving AI. The new agreement, however, allows the company to associate footage with specific vehicles. The agreement states:

By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and cabin camera in the event of a serious safety risk or a safety event such as a collision.

As Electrek points out, this wording may indicate that Tesla wants to secure evidence in case its FSD system is blamed for an accident. The footage could also help the company detect and fix serious problems more quickly.

FSD beta 10.3 was released to a wider group of testers than previous beta versions, but was quickly pulled due to issues such as false forward collision warnings and unexpected automatic braking. At the time, CEO Elon Musk tweeted that such issues were “expected with beta software,” and that it is impossible to test every hardware configuration under all conditions with in-house QA, which is why the company runs a public beta.


However, other drivers on public roads are unknowingly becoming beta testers as well. The U.S. National Highway Traffic Safety Administration (NHTSA) is currently investigating a driver’s claim that FSD caused a collision in Brea, California on November 3. The owner alleges that FSD steered a Model Y into the wrong lane, where it collided with another car, causing significant damage to both vehicles.

Tesla is releasing the new beta to more users with a driver safety score of 98 or higher; until now, beta access had been limited to drivers with a perfect score of 100. The company charges drivers $199 a month (about 23,000 yen) or a one-time $10,000 (about 1.15 million yen) for the feature, but has not kept its promised timelines for autonomous driving. Currently, the FSD system is considered Level 2, far short of the Level 4 required for “fully automated driving.”

Editor’s Note: This article first appeared on Engadget. The author, Steve Dent, is an editor at Engadget.

Image credit: Tesla


(Text: Steve Dent, Translation: saurabh)

