Track autonomous vehicle testing in your state with this new tool from the US government

The U.S. government rolled out a new online tool Wednesday designed to give the public insight into where automated vehicle technology is being tested throughout the country, and by whom.

The official name of the online tool — the Automated Vehicle Transparency and Engagement for Safe Testing Initiative tracking tool — is a jargony word salad. Fortunately, its mechanics are straightforward. The tool lets users find information about on-road testing of automated vehicles in 17 cities throughout the United States. The public can look up a company’s on-road testing and safety performance, the number of vehicles in its fleet, as well as AV-related legislation or policy in specific states.

The AV tracking tool is part of the Automated Vehicle Transparency and Engagement for Safe Testing Initiative, called AV TEST for short, which was announced in June. The National Highway Traffic Safety Administration is overseeing the AV TEST Initiative.

The online tool is hardly comprehensive, but it’s a start, and it continues to expand. The tool currently shows data for 17 cities, including Austin, Columbus (Ohio), Dallas, Denver, Jacksonville, Orlando, Phoenix, Pittsburgh, Salt Lake City, San Francisco and Washington, D.C. The data might include testing activity as well as dates, frequency, vehicle counts and routes, NHTSA said.

The information on the interactive web page is volunteered by the companies themselves. In other words, the federal government does not require companies testing automated vehicle technology to provide data.

However, a growing number of AV founders and engineers understand that public education and acceptance will be necessary if they ever hope to commercially deploy their technology. Ten companies and nine states have already signed on as participants in the voluntary web pilot. The participating companies, to date, are Beep, Cruise, EasyMile, FCA, LM Industries, Navya, Nuro, Toyota, Waymo and Uber Advanced Technologies Group. The online tool also contains voluntarily submitted safety reports from Aurora, Ike, Kodiak, Lyft, TuSimple and Zoox.

NHTSA has limited the number of companies submitting data during the pilot phase, Dr. Joseph M. Kolly, the agency’s chief safety scientist, said during a briefing earlier Wednesday.

“The more information the public has about the on-road testing of automated driving systems, the more they will understand the development of this promising technology,” NHTSA Deputy Administrator James Owens said in a statement. “Automated driving systems are not yet available for sale to the public, and the AV TEST Initiative will help improve public understanding of the technology’s potential and limitations as it continues to develop.”

Tesla Autopilot System Found Probably at Fault in 2018 Crash

WASHINGTON — Tesla’s Autopilot driver-assistance system and a driver who relied too heavily on it are likely to blame for a 2018 crash in California in which the driver died, a federal safety agency said on Tuesday.

The agency, the National Transportation Safety Board, criticized several institutions for failing to do more to prevent the crash, including the National Highway Traffic Safety Administration for what some board members described as a hands-off approach to regulating automated-vehicle technology.

“We urge Tesla to continue to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary,” Robert L. Sumwalt, the board’s chairman, said. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

The board adopted a number of staff findings and recommendations from an investigation into the crash that began more than six months ago. The findings included the determination that Autopilot failed to keep the driver’s vehicle in the lane, that its collision-avoidance software failed to detect a highway barrier and that the driver was probably distracted by a game on his phone.

The board also determined that the driver, Wei Huang, most likely would have survived had the California Transportation Department fixed the barrier he hit, which was designed to absorb some of the impact of a collision but was damaged during a previous crash.

Mr. Sumwalt also said Tesla had not responded to two recommendations the safety board made to the electric-car company and five other automakers in 2017. The board told the companies to limit use of automated systems to the conditions for which they were designed and to better monitor drivers to make sure they remain focused on the road and have their hands on the wheel.

“It’s been 881 days since these recommendations were sent to Tesla and we’ve heard nothing,” he said. “We’re still waiting.”

Tesla did not respond to requests for comment about criticism of Autopilot.

In a statement, the National Highway Traffic Safety Administration said that all crashes caused by distracted driving, including those in which driver-assistance systems were in use, were a “major concern” and that it planned to review the board’s report.

The board’s conclusion is the latest development in a string of federal investigations into crashes involving Autopilot, which can, among other things, keep a moving car in its lane and match the speed of surrounding vehicles. Tesla has said that the system should be used only under certain conditions, but some safety experts say the company doesn’t do enough to educate drivers about those limitations or take steps to make sure drivers do not become overly reliant on the system and, thus, distracted.

Mr. Huang had been playing a game on his phone during the drive, but it was not clear whether he was engaged with the game in the moments before the crash, according to the investigation.

The concerns about Autopilot have done little to slow Tesla’s rise. The company’s share price has more than tripled since October as Tesla’s financial performance has surpassed even the rosiest of analyst expectations. In September, Tesla earned its first safety award from the nonprofit Insurance Institute for Highway Safety, and last week Consumer Reports named Tesla’s first mass-market electric car, the Model 3, one of its top picks for 2020.

Tesla has repeatedly said that Autopilot makes its vehicles safer. In the fourth quarter of 2019, the company reported one accident for every three million miles driven in a Tesla with Autopilot engaged. Overall, the national rate was one accident for every 498,000 miles driven in 2017, according to NHTSA.
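Taken at face value, the two figures quoted above imply that Teslas with Autopilot engaged cover roughly six times more miles per accident than the national average, though the numbers cover different years and driving conditions, so the comparison is illustrative rather than apples-to-apples. A back-of-the-envelope check:

```python
# Back-of-the-envelope comparison of the two crash rates quoted above.
# Both figures come from the article, not from independent data, and they
# cover different years, road types and reporting methods.
tesla_miles_per_accident = 3_000_000    # Q4 2019, Autopilot engaged (Tesla's own figure)
national_miles_per_accident = 498_000   # 2017 national rate, per NHTSA

ratio = tesla_miles_per_accident / national_miles_per_accident
print(f"Tesla's reported figure is ~{ratio:.1f}x more miles per accident")
```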

Still, the electric-car maker faces scrutiny on multiple fronts. The N.T.S.B. and the traffic safety administration are currently investigating more than a dozen crashes in which Autopilot might have played a role.

In the 2018 accident, Autopilot had been engaged for nearly 19 minutes, according to the safety board’s investigation. Mr. Huang put his hands on and off the wheel several times during that period, and in the final minute before the crash, the vehicle detected his hands on the wheel three times for a total of 34 seconds. It did not detect his hands on the wheel in the six seconds before impact.

Tesla’s event data recorders routinely collect a wide variety of information, such as location, speed, seatbelt status, the position of the driver’s seat, the rotation angle of the steering wheel and pressure on the accelerator pedal.
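As a rough illustration, one sample from such an event data recorder might look like the record below. The field names and types here are hypothetical, chosen to mirror the signals the article lists; Tesla's actual log schema is not public.

```python
from dataclasses import dataclass

# Hypothetical sketch of one event-data-recorder sample, covering the
# signal types the article lists. Not Tesla's actual log format.
@dataclass
class EdrSample:
    timestamp_s: float            # seconds since start of recording
    latitude: float               # location
    longitude: float
    speed_mph: float              # vehicle speed
    seatbelt_fastened: bool       # seatbelt status
    driver_seat_position: int     # fore/aft track position of the driver's seat
    steering_angle_deg: float     # rotation angle of the steering wheel
    accelerator_pedal_pct: float  # pressure/position of the accelerator pedal

sample = EdrSample(0.0, 37.41, -122.08, 71.0, True, 3, -2.5, 18.0)
print(sample.speed_mph)
```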

Mr. Huang had been traveling in his 2017 Tesla Model X sport utility vehicle on U.S. 101 in Mountain View when the car struck a median barrier at about 71 miles per hour. The speed limit was 65 m.p.h. The impact spun the car, which later hit two other vehicles and caught fire.

Mr. Huang, who worked at Apple, had previously complained to family of problems with Autopilot along that stretch of highway near State Route 85, his brother told investigators. Data from the vehicle confirmed at least one similar episode near the area dividing the two highways, according to documents from the investigation.

The safety board called on Apple to ban the nonemergency use of company-issued devices while driving. It also called on Apple and other electronics companies to either lock people out of their devices or limit what they can do with the devices while driving.

The first known fatal crash with Autopilot in use occurred in May 2016, when a Tesla failed to stop for a truck that was turning in front of it on a Florida highway. The vehicle hit the trailer, continued traveling underneath it and veered off the road. The driver of that car, Joshua Brown, was killed in the accident.

Both the N.T.S.B. and the traffic safety agency investigated that crash, but came to somewhat different conclusions. In January 2017, NHTSA cleared Autopilot, finding that it had no defects and did not need to be recalled, though the agency called on automakers to clearly explain how such systems work to drivers. Nine months later, the safety board determined that while Autopilot worked as intended, it had nonetheless “played a major role” in the crash.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” Mr. Sumwalt said at the time.

That finding reflects a common critique of Autopilot — that it does not go far enough in forcing drivers to maintain their focus on the road. Unlike Autopilot, Super Cruise, a driver-assistance system offered by General Motors, works only on certain highways and tracks drivers’ heads to make sure they are paying attention to the road.

Critics also say Tesla and its chief executive, Elon Musk, have exaggerated Autopilot’s capabilities.

In 2018, for example, Mr. Musk was widely criticized for taking his hands off a Tesla Model 3 steering wheel while demonstrating Autopilot for the CBS News program “60 Minutes,” something the vehicle owner’s manual instructs drivers using Autopilot never to do.

In January, Mr. Musk told investors that Tesla’s “full self-driving capability” might be just a few months from having “some chance of going from your home to work, let’s say, with no interventions.”

Jason Levine, executive director of the Center for Auto Safety, an advocacy group, said that “by calling it Autopilot, by using terms like ‘full self-driving,’ Tesla is intentionally misleading consumers as to the capabilities of the technology.”

To avoid false expectations, German regulators reportedly asked Tesla in 2016 to stop using the term Autopilot, arguing that it suggests that the technology is more advanced than it really is.

Autonomous technology is commonly categorized into six levels, from zero to five, as defined by SAE International, an association of automotive engineers. Level 5 represents full autonomy, in which a vehicle can perform all driving functions on its own, including navigating to a chosen destination. Autopilot and Super Cruise are considered Level 2 “partial automation” technologies, which enable a vehicle to control steering, braking and acceleration yet require the full attention of a human driver.
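The SAE taxonomy can be sketched as a simple lookup table. The short descriptions below are paraphrased for illustration, not the official wording of the SAE J3016 standard:

```python
# Paraphrased summary of the six SAE automation levels (J3016).
# Descriptions are illustrative, not the standard's official text.
SAE_LEVELS = {
    0: "No automation: the human driver performs all driving tasks",
    1: "Driver assistance: steering OR speed is assisted, not both",
    2: "Partial automation: steering AND speed controlled, driver must supervise",
    3: "Conditional automation: system drives, driver must be ready to intervene",
    4: "High automation: no driver needed within a defined operating domain",
    5: "Full automation: the vehicle drives itself under all conditions",
}

# Autopilot and Super Cruise both sit at Level 2.
for system in ("Autopilot", "Super Cruise"):
    print(f"{system}: Level 2 - {SAE_LEVELS[2]}")
```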

Evidence of drivers misusing Autopilot abounds on the internet. And in a survey last year, the Insurance Institute for Highway Safety found that 48 percent of drivers believed it was safe to remove their hands from a steering wheel while using Autopilot. By comparison, 33 percent or fewer drivers said the same thing about similar systems in cars made by other automakers.

Tesla calls claims of unintended acceleration in NHTSA petition ‘completely false’

Tesla pushed back Monday against claims that its electric vehicles may suddenly accelerate on their own, calling a petition filed with federal safety regulators “completely false.”

Tesla also questioned the validity of the petition, noting that it was submitted by a Tesla short-seller.

Last week, the National Highway Traffic Safety Administration said it would review a defect petition that cited 127 consumer complaints of alleged unintended acceleration of Tesla electric vehicles that may have contributed to or caused 110 crashes and 52 injuries.

The petition, which was first reported by CNBC, was filed by Brian Sparks, an independent investor who is currently shorting Tesla’s stock. Sparks has hedged his bets and has been long Tesla in the past, according to the CNBC report.

At the time, Tesla didn’t respond to requests for comment. Now, in a blog post, the company said that it routinely reviews customer complaints of unintended acceleration with NHTSA.

“In every case we reviewed with them, the data proved the vehicle functioned properly,” Tesla wrote.

The automaker argued that its vehicles are designed to avoid unintended acceleration, noting that its system will default to cutting off motor torque if the two independent position sensors on its accelerator pedals register any error.

“We also use the Autopilot sensor suite to help distinguish potential pedal misapplications and cut torque to mitigate or prevent accidents when we’re confident the driver’s input was unintentional,” the company wrote.
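The fail-safe behavior Tesla describes can be sketched in a few lines: redundant accelerator sensors must agree, and the brake always overrides the accelerator. Everything below (function names, the tolerance value, the structure) is a hypothetical simplification; Tesla's actual implementation is not public.

```python
# Hypothetical simplification of the described pedal fail-safe logic.
# Not Tesla's actual code; the tolerance value is an assumption.
SENSOR_AGREEMENT_TOLERANCE = 0.05  # assumed, as a fraction of full pedal travel

def commanded_torque(sensor_a: float, sensor_b: float, brake_pressed: bool) -> float:
    """Return a motor torque command in [0, 1] given two redundant
    accelerator-pedal readings (each in [0, 1]) and the brake state."""
    # The brake pedal overrides the accelerator entirely.
    if brake_pressed:
        return 0.0
    # If the redundant sensors disagree, treat it as an error and cut torque.
    if abs(sensor_a - sensor_b) > SENSOR_AGREEMENT_TOLERANCE:
        return 0.0
    # Otherwise command torque proportional to the (agreed) pedal position.
    return (sensor_a + sensor_b) / 2

print(commanded_torque(0.5, 0.51, False))  # sensors agree: torque follows pedal
print(commanded_torque(0.5, 0.90, False))  # sensors disagree: torque cut
print(commanded_torque(0.8, 0.80, True))   # brake pressed: torque cut
```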

Here is the complete response from Tesla:

This petition is completely false and was brought by a Tesla short-seller. We investigate every single incident where the driver alleges to us that their vehicle accelerated contrary to their input, and in every case where we had the vehicle’s data, we confirmed that the car operated as designed. In other words, the car accelerates if, and only if, the driver told it to do so, and it slows or stops when the driver applies the brake.

While accidents caused by a mistaken press of the accelerator pedal have been alleged for nearly every make/model of vehicle on the road, the accelerator pedals in Model S, X and 3 vehicles have two independent position sensors, and if there is any error, the system defaults to cut off motor torque. Likewise, applying the brake pedal simultaneously with the accelerator pedal will override the accelerator pedal input and cut off motor torque, and regardless of the torque, sustained braking will stop the car. Unique to Tesla, we also use the Autopilot sensor suite to help distinguish potential pedal misapplications and cut torque to mitigate or prevent accidents when we’re confident the driver’s input was unintentional. Each system is independent and records data, so we can examine exactly what happened.

We are transparent with NHTSA, and routinely review customer complaints of unintended acceleration with them. Over the past several years, we discussed with NHTSA the majority of the complaints alleged in the petition. In every case we reviewed with them, the data proved the vehicle functioned properly.

Source: TechCrunch