Tesla Autopilot


Tesla Autopilot is a suite of advanced driver-assistance system features offered by Tesla that includes lane centering, traffic-aware cruise control, self-parking, automatic lane changes, semi-autonomous navigation on limited-access freeways, and the ability to summon the car from a garage or parking spot. With all of these features, the driver remains responsible and the car requires constant supervision. The company claims the features reduce accidents caused by driver negligence and fatigue from long periods of driving.
As an upgrade to the base Autopilot capabilities, the company's stated intent is to offer full self-driving at a future time, acknowledging that legal, regulatory, and technical hurdles must be overcome to achieve this goal.
As of April 2019, most experts believe that Tesla vehicles lack the necessary hardware for full self-driving. In April 2020, Tesla was ranked last by Navigant Research for both strategy and execution in the autonomous driving sector.

History

Elon Musk first discussed the Autopilot system publicly in 2013, noting that "Autopilot is a good thing to have in planes, and we should have it in cars."
All Tesla cars manufactured between September 2014 and October 2016 had the initial hardware that supported Autopilot. On October 9, 2014, Tesla offered customers the ability to pre-purchase Autopilot capability within a "Tech Package" option. At that time Tesla stated Autopilot would include semi-autonomous drive and parking capabilities, and was not designed for self-driving.
Initial versions of Autopilot were developed in partnership with Israeli company Mobileye. Tesla and Mobileye ended their partnership in July 2016.
Software enabling Autopilot was released in mid-October 2015 as part of Tesla software version 7.0. At that time, Tesla announced its goal to offer self-driving technology. Software version 7.1 then removed some features to discourage customers from engaging in risky behavior and added the Summon remote parking capability that can move the car forward and backward under remote human control without a driver in the car.
In March 2015, speaking at an Nvidia conference, Musk discussed the prospects for autonomous driving.
In December 2015, Musk predicted "complete autonomy" by 2018.
On August 31, 2016, Elon Musk announced Autopilot 8.0, which processes radar signals to create a coarse point cloud, similar to LiDAR, to help navigate in low visibility and even to 'see' ahead of the car in front of the Tesla. In November 2016, Autopilot 8.0 was updated to give a more noticeable signal to the driver that it is engaged and to require drivers to touch the steering wheel more frequently. By November 2016, Autopilot had operated actively on HW1 vehicles for 300 million miles and for 1.3 billion miles in "shadow" mode.
Tesla states that as of October 2016, all new vehicles come with the sensors and computing hardware, known as hardware version 2 (HW2), needed for future full self-driving. Tesla uses the term "Enhanced Autopilot" to refer to HW2 capabilities that were not available on hardware version 1, including the ability to automatically change lanes without driver input, transition from one freeway to another, and exit the freeway when the destination is near.
Autopilot software for HW2 cars arrived in February 2017. It included traffic-aware cruise control, autosteer on divided highways, and autosteer on 'local roads' up to a speed of 35 mph, or a specified number of mph over the local speed limit, to a maximum of 45 mph. Software version 8.1 for HW2 arrived in June 2017, adding a new driving-assist algorithm, full-speed braking, and handling of parallel and perpendicular parking. Later releases offered smoother lane-keeping and less jerky acceleration and deceleration.
HW2.5 appeared in cars built from August 2017.
In March or April 2019, Tesla began fitting a new version of the "full self-driving computer" which has two Tesla-designed microprocessors.
In April 2019, Tesla started releasing an update to Navigate on Autopilot, which does not require lane change confirmation, but does require the driver to have hands on the steering wheel. The car will navigate freeway interchanges on its own, but the driver needs to supervise. The ability is available to those who have purchased Enhanced Autopilot or Full Self-Driving Capability.
In May 2019, Tesla provided an updated Autopilot in Europe to comply with the new UN/ECE R79 regulation on automatically commanded steering functions.
In September 2019, Tesla released software version 10 to early access users. This software featured improvements in driving visualization and automatic lane changes.
In February 2019, Elon Musk said that he thought Tesla's Full Self Driving capability would be "feature complete" by the end of 2019. In January 2020, Musk pushed back his projection to be feature complete to the end of 2020, and added that feature complete "doesn't mean that features are working well."
In April 2020, Tesla released a "beta" feature to recognize and respond to stop signs and traffic lights.
Industry experts have said that Musk's and Tesla's autonomous driving plans lack credibility, and that their goal of achieving Level 5 autonomy by the end of 2020 is not feasible.

Driving features

As of 2019, Autopilot was recommended for use on interstate highways only, although it functioned on some city streets and highways. The features included driver-assistance functionality classified as Level 2 automation on a scale from 0 to 5.
Autopilot is only designed to be used on limited-access highways.
Tesla requires operators to monitor and remain responsible for the vehicle at all times, including when Autopilot is enabled.

HW1, HW2, HW3 and all listed software versions

Software updates

Autopilot-enabled cars receive Autopilot software updates wirelessly, as part of recurring Tesla software updates.

Safety features

If Autopilot detects a potential front or side collision with another vehicle, bicycle, or pedestrian within a certain distance, it sounds a warning. Autopilot includes automatic emergency braking, which detects objects that may hit the car and applies the brakes. The car may also automatically swerve to prevent a collision.

Visualization

Autopilot includes a video display of some of what the car detects around it. It displays driving lanes and vehicles in front of, behind, and on either side of the car, as well as lane markings and speed limits. On HW3, it also displays stop signs and traffic signals. It distinguishes pedestrians, bicyclists/motorcyclists, small cars, and larger SUVs/trucks.

Speed assist

On HW1 vehicles, front-facing cameras detect speed limit signs and the current limit is shown on the dashboard or center display. If no signs are present, or on HW2 and HW2.5 vehicles, speed limits are determined from GPS data.

Traffic-aware cruise control

Traffic-aware cruise control, also known as adaptive cruise control, maintains a safe distance from the vehicle in front by accelerating and braking as that vehicle speeds up and slows down. It also slows on tight curves, on interstate ramps, and when a car enters or exits the road ahead. It can be enabled at any speed between 0 mph and 90 mph. By default, it sets the target speed to the current speed limit plus or minus a driver-specified offset, then adjusts the target as speed limits change. If road conditions warrant, Autosteer and cruise control disengage, and an audio and visual signal indicate that the driver must assume full control.
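The set-speed behaviour described above can be illustrated with a minimal sketch; the function and parameter names below are hypothetical illustrations, not Tesla's implementation:

```python
def target_speed(speed_limit_mph: float, driver_offset_mph: float,
                 min_mph: float = 0.0, max_mph: float = 90.0) -> float:
    """Hypothetical sketch of the rule described above: the target speed
    follows the current speed limit plus or minus a driver-specified
    offset, clamped to the 0-90 mph range in which traffic-aware cruise
    control can be enabled."""
    return max(min_mph, min(speed_limit_mph + driver_offset_mph, max_mph))

# Example: a 65 mph limit with a +5 mph offset gives a 70 mph target;
# if the detected limit drops to 45 mph, the target is re-derived as 50 mph.
print(target_speed(65, 5))   # 70
print(target_speed(45, 5))   # 50
```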

Autosteer (Beta)

Autosteer steers the car to keep it in its current lane. It can safely change lanes when the driver taps the turn-signal stalk. On HW2 vehicles, Autosteer was initially limited to highway roads, or to five miles per hour over the speed limit on non-divided highways. If the driver ignores three audio warnings to keep hands on the steering wheel within an hour, Autopilot is disabled until the car is restarted.
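The warning-escalation rule described above can be sketched as a simple rolling counter; this is a hypothetical illustration only, and the names and structure are not Tesla's:

```python
from dataclasses import dataclass, field

@dataclass
class AutosteerWarningTracker:
    """Hypothetical illustration of the rule described above: three ignored
    hands-on-wheel warnings within one hour disable Autopilot until the
    car is restarted."""
    window_s: float = 3600.0          # one-hour rolling window
    max_ignored: int = 3
    ignored_at: list = field(default_factory=list)
    disabled: bool = False

    def record_ignored_warning(self, t: float) -> None:
        # keep only warnings that fall inside the rolling window
        self.ignored_at = [w for w in self.ignored_at if t - w < self.window_s]
        self.ignored_at.append(t)
        if len(self.ignored_at) >= self.max_ignored:
            self.disabled = True      # stays disabled until restart

    def restart(self) -> None:
        self.ignored_at.clear()
        self.disabled = False

tracker = AutosteerWarningTracker()
for t in (0.0, 600.0, 1200.0):        # three ignored warnings within an hour
    tracker.record_ignored_warning(t)
print(tracker.disabled)               # True
```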

HW1, HW2, HW3 with advanced software versions

Autopark

Autopark parks the car in perpendicular or parallel parking spaces, with either nose or tail facing outward.

Summon (Beta)

Basic Summon, enabled in 2016, moves the car into and out of a tight space using the Tesla phone app or key fob.

HW2, HW3 with advanced software versions

Navigate on Autopilot (Beta)

HW2 and later vehicles with Autosteer can execute automatic lane changes, move to a more appropriate lane based on speed, exit a freeway, and navigate freeway interchanges.

Smart Summon (Beta)

Smart Summon, enabled in 2019, allows supervised, line-of-sight remote retrieval of the car from up to 150 feet away using the Tesla phone app.

HW3 with advanced software versions

Traffic Light and Stop Sign Control

When using Traffic-Aware Cruise Control or Autosteer, this feature recognizes and responds to traffic lights and stop signs. This feature will slow and eventually stop for stop signs and all detected traffic lights. To continue through a green light, the user must push down the gear selector once or briefly press the accelerator pedal.

Full self-driving capability

Tesla has claimed since 2016 that all new vehicles have the hardware necessary for full self-driving; however, free hardware upgrades have been required. Tesla claims that the current software will be upgraded over time to provide full self-driving without the need for further hardware changes.
There is controversy over whether LiDAR is needed for full self-driving capability; Tesla vehicles do not have LiDAR. Some news reports in 2019 described LiDAR as something "practically everyone views as an essential ingredient for self-driving cars" and noted that "experts and proponents say it adds depth and vision where camera and radar alone fall short." However, Elon Musk has called LiDAR "stupid, expensive and unnecessary", and in 2019 researchers at Cornell University, "[u]sing two inexpensive cameras on either side of a vehicle's windshield... have discovered they can detect objects with nearly LiDAR's accuracy and at a fraction of the cost."

Technical specifications


Hardware 1

Vehicles manufactured after late September 2014 are equipped with a camera mounted at the top of the windshield, forward-looking radar in the lower grille, and ultrasonic acoustic location sensors in the front and rear bumpers that provide a 360-degree view around the car. The computer is the Mobileye EyeQ3. This equipment allows the Model S to detect road signs, lane markings, obstacles, and other vehicles.
The driver can initiate an automatic lane change by activating the turn signal when it is safe to do so; the system then completes the lane change.
Upgrading from Hardware 1 to Hardware 2 is not offered as it would require substantial work and cost.

Hardware 2

HW2, included in all vehicles manufactured after October 2016, includes an NVIDIA Drive PX 2 GPU for CUDA-based GPGPU computation. Tesla claimed that HW2 provided the equipment needed for full self-driving capability at SAE Level 5. The hardware includes eight surround cameras and 12 ultrasonic sensors, in addition to forward-facing radar with enhanced processing capabilities. The Autopilot computer is replaceable to allow for future upgrades. The radar is able to observe beneath and ahead of the vehicle in front of the Tesla, and can see vehicles through heavy rain, fog, or dust. Tesla claimed that the hardware was capable of processing 200 frames per second.
When "Enhanced Autopilot" was enabled in February 2017 by the v8.0 software update, testing showed the system was limited to using one of the eight onboard cameras—the main forward-facing camera The v8.1 software update released a month later enabled a second camera, the narrow-angle forward-facing camera.

Hardware 2.5

In August 2017, Tesla announced that HW2.5 included a secondary processor node to provide more computing power and additional wiring redundancy to slightly improve reliability; it also enabled dashcam and sentry mode capabilities.

Hardware 3

According to Tesla's director of artificial intelligence, Andrej Karpathy, as of Q3 2018 Tesla had developed large neural networks for Autopilot that could not be deployed because of the limited computational resources in then-current Tesla vehicles. HW3 provided the resources to allow for improved accuracy in predictions.
HW3 includes a custom Tesla-designed system on a chip. Tesla claimed that the new system would process 2,300 frames per second, which is a 21x improvement in image processing compared to HW2.5, which is capable of 110 fps. The firm described it as a "neural network accelerator". The company claimed that HW3 was necessary for "full self-driving", but not for "enhanced Autopilot" functions.
HW3 first became available in April 2019. Elon Musk stated that customers who purchased the Full Self-Driving package would be eligible for an upgrade to HW3 at no cost.
Tesla claims HW3 has 2.5x the performance of HW2.5, at 1.25x the power consumption and 0.8x the cost. HW3 features twelve ARM Cortex-A72 CPUs operating at 2.6 GHz, two neural network accelerators operating at 2 GHz, and a Mali GPU operating at 1 GHz.
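Taken together, the figures quoted in this section imply the following back-of-the-envelope ratios (these are Tesla's claimed numbers, not independent measurements):

```python
# Back-of-the-envelope comparison of the HW2.5 and HW3 figures quoted above.
hw25_fps = 110      # claimed image-processing throughput of HW2.5
hw3_fps = 2300      # claimed image-processing throughput of HW3
perf_ratio = 2.5    # claimed overall performance vs HW2.5
power_ratio = 1.25  # claimed power consumption vs HW2.5
cost_ratio = 0.8    # claimed cost vs HW2.5

print(f"Frame-rate ratio: {hw3_fps / hw25_fps:.1f}x")             # ~20.9x, i.e. the quoted ~21x
print(f"Performance per watt: {perf_ratio / power_ratio:.1f}x")   # 2.0x
print(f"Performance per dollar: {perf_ratio / cost_ratio:.2f}x")  # ~3.12x
```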

Public debate

Regulation and liability

Some industry experts have raised questions about the legal status of autonomous driving in the U.S. and whether Tesla owners would violate current state regulations when using the Autopilot function. The few states that have passed laws allowing autonomous cars on the road limit their use to testing purposes, not use by the general public. There are also questions about liability for autonomous cars in the event of a crash. A Tesla spokesman said there is "nothing in our autopilot system that is in conflict with current regulations", adding, "We are not getting rid of the pilot. This is about releasing the driver from tedious tasks so they can focus and provide better input." Google's director of self-driving cars said he does not think there is a regulatory block as long as the self-driving vehicle meets crash-test and other safety standards. A spokesman for the U.S. National Highway Traffic Safety Administration said that "any autonomous vehicle would need to meet applicable federal motor vehicle safety standards" and that the NHTSA "will have the appropriate policies and regulations in place to ensure the safety of this type of vehicles."
Tesla's Autopilot with HW1 can be classified as somewhere between Levels 2 and 3 under the U.S. Department of Transportation's National Highway Traffic Safety Administration five levels of vehicle automation. At this level, the car can act autonomously but requires the driver to be prepared to take control at a moment's notice. HW1 is suitable only on limited-access highways, and it sometimes fails to detect lane markings and disengages itself. In urban driving, the system will not read traffic signals or obey stop signs. It also does not detect pedestrians or cyclists, and while HW1 detects motorcycles, there have been two instances of HW1 cars rear-ending motorcycles.
Tesla's Autopilot was the subject of a class action suit brought in 2017 that claimed the second-generation Enhanced Autopilot system was "dangerously defective." The suit was settled in 2018; owners who had paid in 2016 and 2017 to equip their cars with the updated Autopilot software were compensated between US$20 and US$280 for the delay in implementing Autopilot 2.0.
In July 2020, a German court ruled that Tesla made exaggerated promises about its Autopilot technology, and that the "Autopilot" name created the false impression that the car can drive itself.

Safety concerns

The National Transportation Safety Board has cited Tesla's Autopilot as a probable cause in at least three deadly crashes and has criticized Tesla for failing to foresee and prevent "predictable abuse" of Autopilot. The Center for Auto Safety and Consumer Watchdog have called for federal and state investigations into Autopilot and Tesla's marketing of the technology, which they believe is "dangerously misleading and deceptive", giving consumers the false impression that their vehicles are self-driving or autonomous. UK safety experts have called Tesla's Autopilot "especially misleading."
Drivers have been found sleeping at the wheel and driving under the influence of alcohol with Autopilot engaged. Tesla has not implemented more robust driver-monitoring features that could prevent distracted and inattentive driving.
Autopilot sometimes fails to detect stationary vehicles, which has led to numerous crashes with stopped emergency vehicles.

Future development

According to Elon Musk, full autonomy is "really a software limitation: The hardware exists to create full autonomy, so it's really about developing advanced, narrow AI for the car to operate on." The Autopilot development focus is on "increasingly sophisticated neural nets that can operate in reasonably sized computers in the car". According to Musk, "the car will learn over time", including from other cars.

Statistics

Tesla's self-reported quarterly summary statistics for vehicle safety in Q4 2019 reported one accident for every 3.07 million miles driven in which drivers had Autopilot engaged, compared with one accident for every 2.10 million miles driven for Tesla vehicles driven without Autopilot and without active safety features.
In 2016, data from 47 million miles of driving in Autopilot mode showed the probability of an accident was at least 50% lower when using Autopilot. During the investigation into the fatal crash of May 2016 in Williston, Florida, NHTSA released a preliminary report in January 2017 stating that "the Tesla vehicles' crash rate dropped by almost 40 percent after Autosteer installation." Disputing this, in 2019 a private company, Quality Control Systems, released a report analyzing the same data and stating that the NHTSA conclusion was "not well-founded"; its analysis showed the crash rate actually increased from 0.76 to 1.21 crashes per million miles after the installation of Autosteer.
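For comparison on a common scale, the figures reported above convert to per-million-mile rates as follows (a simple recalculation of the numbers quoted in this section, not new data):

```python
# Per-million-mile rates implied by the figures quoted above.
miles_per_accident_autopilot = 3.07e6   # Tesla Q4 2019, Autopilot engaged
miles_per_accident_unassisted = 2.10e6  # Tesla Q4 2019, no Autopilot, no active safety

rate_ap = 1e6 / miles_per_accident_autopilot    # ~0.33 accidents per million miles
rate_un = 1e6 / miles_per_accident_unassisted   # ~0.48 accidents per million miles
print(f"Autopilot: {rate_ap:.2f}, unassisted: {rate_un:.2f} accidents per million miles")

# Quality Control Systems' re-analysis of the NHTSA Autosteer data:
before, after = 0.76, 1.21                      # crashes per million miles
print(f"Change after Autosteer installation: {(after / before - 1):+.0%}")  # about +59%
```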
Andrej Karpathy, Tesla's head of AI and computer vision, has stated that Tesla cars have driven 3 billion miles on Autopilot, of which 1 billion were driven using Navigate on Autopilot; that Tesla cars have performed 200,000 automated lane changes; and that 1.2 million Smart Summon sessions have been initiated. He has also stated that Tesla cars avoid pedestrian accidents at a rate of tens to hundreds per day.

Academic studies

A study from the Center for Transportation and Logistics at the Massachusetts Institute of Technology in 2019 reported the results of an analysis of 18,928 disengagements of Autopilot in real-world usage. The study showed that patterns of decreased vigilance, while common in human-machine interaction paradigms, are not inherent to AI-assisted driving. In particular, the central observation in the dataset was that drivers use Autopilot for 34.8% of their driven miles, and yet appear to maintain a relatively high degree of functional vigilance.

Behavior

Ars Technica notes that the braking system tends to initiate later than some drivers expect. One driver claimed that Tesla's Autopilot failed to brake, resulting in collisions; Tesla pointed out that the driver had deactivated the car's cruise control prior to the crash. Ars Technica also notes that lane changes are semi-automatic: the car may change lanes without driver input if it detects slow-moving cars or if a lane change is required to stay on route, but the driver must first show the car that he or she is paying attention by touching the steering wheel.
In a 2019 Bloomberg survey, hundreds of Tesla owners reported dangerous behaviors with Autopilot, such as phantom braking, veering out of lane, or failing to stop for road hazards. Autopilot users have also reported the software crashing and turning off suddenly, collisions with off ramp barriers, radar failures, unexpected swerving, tailgating, and uneven speed changes.

Comparisons and awards

In 2018, Consumer Reports rated Tesla Autopilot as second best out of four "partially automated driving systems". Autopilot scored highly for its capabilities and ease of use, but was worse at keeping the driver engaged than the other manufacturers' systems. Consumer Reports also found multiple problems with Autopilot's automatic lane change function, such as cutting too closely in front of other cars and passing on the right.
In 2018, the Insurance Institute for Highway Safety compared Tesla, BMW, Mercedes and Volvo "advanced driver assistance systems" and stated that the Tesla Model 3 experienced the fewest incidents of crossing over a lane line, touching a lane line, or disengaging.
In 2019, Tesla Autopilot's navigate-on-autopilot feature won the Connected Car Innovation Award in Germany for "best innovative automotive solution".
In February 2020, Car and Driver compared Cadillac Super Cruise, Comma.ai, and Autopilot, concluding that Autopilot was extremely capable and the most versatile of the three, while questioning whether it could evolve into full self-driving.
In June 2020, Cadillac Super Cruise and Tesla Autopilot were compared; the conclusion was that "Super Cruise is more advanced, while Autopilot is more comprehensive."

Incidents

Handan (January 20, 2016)

On January 20, 2016, the driver of a Tesla Model S in Handan, China, was killed when their car crashed into a stationary truck. The Tesla was following a car in the far left lane of a multi-lane highway; the car in front moved to the right lane to avoid a truck stopped on the left shoulder, and the Tesla, which the driver's father believes was in Autopilot mode, did not slow before colliding with the stopped truck. According to footage captured by a dashboard camera, the stationary street sweeper on the left side of the expressway partially extended into the far left lane, and the driver did not appear to respond to the unexpected obstacle.
In September 2016, the media reported that the driver's family had filed a lawsuit in July against the Tesla dealer who sold the car. The family's lawyer stated the suit was intended "to let the public know that self-driving technology has some defects. We are hoping Tesla, when marketing its products, will be more cautious. Do not just use self-driving as a selling point for young people." Tesla released a statement saying it had "no way of knowing whether or not Autopilot was engaged at the time of the crash" since the car's telemetry could not be retrieved remotely due to damage caused by the crash. In 2018, the lawsuit was at a stalemate: the telemetry had been recorded locally to an SD card that was not given to Tesla, and Tesla instead provided a decoding key to a third party for independent review. Tesla stated that "while the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed." Chinese media later reported that the family had, in parallel, sent the data from the card to Tesla's U.S. headquarters while waiting for Tesla China's statement, and that after receiving the telemetry data, Tesla US admitted Autopilot was engaged two minutes before the crash.

Williston, Florida (May 7, 2016)

The first publicized fatal accident involving a Tesla engaged in Autopilot mode took place in Williston, Florida, on May 7, 2016. The driver was killed in a crash with an 18-wheel tractor-trailer. By late June 2016, the U.S. National Highway Traffic Safety Administration had opened a formal investigation into the fatal accident, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicated the crash occurred when the tractor-trailer made a left turn in front of the 2015 Tesla Model S at an intersection on a non-controlled-access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer. The Tesla's diagnostic log recorded its speed as it collided with and traveled under the trailer, which was not equipped with a side underrun protection system. The underride collision sheared off the Tesla's glasshouse, destroying everything above the beltline, and caused fatal injuries to the driver. In the approximately nine seconds after colliding with the trailer, the Tesla continued traveling and came to rest after colliding with two chain-link fences and a utility pole.
The NHTSA's preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involves a population of an estimated 25,000 Model S cars. On July 8, 2016, the NHTSA requested Tesla Inc. to hand over to the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla's planned updates scheduled for the next four months.
According to Tesla, "neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car attempted to drive full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S." Tesla also stated that this was Tesla’s first known Autopilot-related death in over 130 million miles driven by its customers while Autopilot was activated. According to Tesla there is a fatality every 94 million miles among all type of vehicles in the U.S. It is estimated that billions of miles will need to be traveled before Tesla Autopilot can claim to be safer than humans with statistical significance. Researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and understood enough for mass-market use.
The truck's driver told the Associated Press that he could hear a Harry Potter movie playing in the crashed car, saying "he went so fast through my trailer I didn't see him" and "It was still playing when he died and snapped a telephone pole a quarter mile down the road." The Florida Highway Patrol found an aftermarket portable DVD player in the wreckage; it is not possible to watch videos on the Model S touchscreen display. A laptop computer was recovered during the post-crash examination of the wreck, along with an adjustable vehicle laptop mount attached to the front passenger's seat frame. The NHTSA concluded the laptop was probably mounted and the driver may have been distracted at the time of the crash.
In July 2016, the U.S. National Transportation Safety Board announced it had opened a formal investigation into the fatal accident while Autopilot was engaged. The NTSB is an investigative body that only has the power to make policy recommendations. An agency spokesman said, "It's worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible." The NTSB opens annually about 25 to 30 highway investigations. In September 2017, the NTSB released its report, determining that "the probable cause of the Williston, Florida, crash was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the presence of the truck. Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer."
In January 2017, the NHTSA Office of Defects Investigation released a preliminary evaluation, finding that the driver in the crash had seven seconds to see the truck and identifying no defects in the Autopilot system; the ODI also found that the Tesla crash rate dropped by 40 percent after Autosteer installation, but later clarified that it had not assessed the effectiveness of the technology or whether it was engaged in its crash-rate comparison. The NHTSA Special Crash Investigation team published its report in January 2018. According to the report, on the drive leading up to the crash the driver engaged Autopilot for 37 minutes and 26 seconds, and the system provided 13 "hands not detected" alerts, to which the driver responded after an average delay of 16 seconds. The report concluded: "Regardless of the operational status of the Tesla's ADAS technologies, the driver was still responsible for maintaining ultimate control of the vehicle. All evidence and data gathered concluded that the driver neglected to maintain complete control of the Tesla leading up to the crash."

Culver City, California (January 22, 2018)

On January 22, 2018, a 2014 Tesla Model S crashed into a fire truck parked on the side of the I-405 freeway in Culver City, California, while traveling at freeway speed; the driver survived with no injuries. The driver told the Culver City Fire Department that he was using Autopilot. The fire truck and a California Highway Patrol vehicle were parked diagonally across the left emergency lane and high-occupancy vehicle lane of the southbound 405, blocking off the scene of an earlier accident, with emergency lights flashing.
According to a post-accident interview, the driver stated he was drinking coffee, eating a bagel, and maintaining contact with the steering wheel while resting his hand on his knee. During the trip, which lasted 66 minutes, the Autopilot system was engaged for slightly more than 29 minutes; of those 29 minutes, hands were detected on the steering wheel for only 78 seconds in total, and hands were detected applying torque to the wheel for only 51 seconds during the nearly 14 minutes immediately preceding the crash. The Tesla had been following a lead vehicle in the high-occupancy vehicle lane; when the lead vehicle moved to the right to avoid the fire truck, approximately three to four seconds prior to impact, the Tesla's traffic-aware cruise control began accelerating the Tesla toward its preset speed, and the car was still accelerating when the impact occurred. The Autopilot system issued a forward collision warning half a second before the impact but did not engage the automatic emergency braking system, and the driver did not manually intervene by braking or steering. Because Autopilot requires agreement between the radar and the cameras to initiate automatic emergency braking, the system was challenged by this specific scenario and the limited time available after the forward collision warning.
Several news outlets began reporting that Autopilot may not detect stationary vehicles at highway speeds and that it cannot detect some objects. Raj Rajkumar, who studies autonomous driving systems at Carnegie Mellon University, believes the radars used for Autopilot are designed to detect moving objects but are "not very good in detecting stationary objects". Both the NTSB and the NHTSA dispatched teams to investigate the crash. Hod Lipson, director of Columbia University's Creative Machines Lab, faulted the diffusion of responsibility concept: "If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that's a dangerous thing."
In August 2019, the NTSB released its accident brief, HAB-19-07, which concluded the driver of the Tesla was at fault due to "inattention and overreliance on the vehicle's advanced driver assistance system", but added that the design of the Tesla Autopilot system "permitted the driver to disengage from the driving task". After the earlier crash in Williston, the NTSB had issued a safety recommendation to "[d]evelop applications to more effectively sense the driver's level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use." Among the manufacturers to which the recommendation was issued, only Tesla had failed to respond.

Mountain View, California (March 23, 2018)

On March 23, 2018, a second U.S. Autopilot fatality occurred in Mountain View, California. The crash occurred just before 9:30 A.M. on southbound US 101 at the carpool lane exit for southbound Highway 85, at a concrete barrier where the left-hand carpool lane offramp separates from 101. After the Model X crashed into the narrow concrete barrier, it was struck by two following vehicles, and then it caught on fire.
Both the NHTSA and the NTSB are investigating the March 2018 crash. In April 2018, another Model S driver demonstrated that Autopilot appeared to be confused by the road surface markings at the crash site. The gore ahead of the barrier is marked by diverging solid white lines, and the Autosteer feature of the Model S appeared to mistakenly use the left-side white line instead of the right-side white line as the lane marking for the far left lane, which would have led the Model S into the same concrete barrier had the driver not taken control. Ars Technica concluded "that as Autopilot gets better, drivers could become increasingly complacent and pay less and less attention to the road."
In a corporate blog post, Tesla noted that the impact attenuator separating the offramp from US 101 had been crushed in a prior accident and not replaced before the Model X crash on March 23. The post also stated that Autopilot was engaged at the time of the crash and that the driver's hands had not been detected on the steering wheel for six seconds before the crash. Vehicle data showed the driver had five seconds and an unobstructed view of the concrete divider, "but the vehicle logs show that no action was taken." The NTSB investigation had been focused on the damaged impact attenuator and the vehicle fire after the collision, but after it was reported that the driver had complained about the Autopilot functionality, the NTSB announced it would also investigate "all aspects of this crash including the driver's previous concerns about the autopilot." An NTSB spokesman stated the organization "is unhappy with the release of investigative information by Tesla". Elon Musk dismissed the criticism, tweeting that the NTSB was "an advisory body" and that "Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe." In response, the NTSB removed Tesla as a party to the investigation on April 11.
The NTSB released a preliminary report on June 7, 2018, which provided the recorded telemetry of the Model X and other factual details. Autopilot had been engaged continuously for almost nineteen minutes prior to the crash. In the minute before the crash, the driver's hands were detected on the steering wheel for 34 seconds in total, but they were not detected during the six seconds immediately preceding the crash. Seven seconds before the crash, the Tesla began to steer to the left while following a lead vehicle; four seconds before the crash, it was no longer following a lead vehicle; and during the three seconds before the crash, its speed increased. The driver was wearing a seatbelt and was pulled from the vehicle before it was engulfed in flames.
The crash attenuator had been damaged on March 12 and had not been replaced at the time of the Tesla crash. The driver involved in the March 12 accident collided with the crash attenuator at a higher speed and was treated for minor injuries; in comparison, the driver of the Tesla collided with the already-collapsed attenuator at a slower speed and died from blunt force trauma. After the accident on March 12, the California Highway Patrol failed to report the collapsed attenuator to Caltrans as required. Caltrans was not aware of the damage until March 20, and the attenuator was not replaced until March 26 because a spare was not immediately available. This specific attenuator had required repair more often than any other crash attenuator in the Bay Area, and maintenance records indicated that repairs to it had been delayed by up to three months after damage. As a result, the NTSB released a Safety Recommendation Report on September 9, 2019, asking Caltrans to develop and implement a plan to guarantee timely repair of traffic safety hardware.
At an NTSB meeting held on February 25, 2020, the board concluded the crash was caused by a combination of the limitations of the Tesla Autopilot system, the driver's over-reliance on Autopilot, and driver distraction, likely from playing a video game on his phone. The vehicle's ineffective monitoring of driver engagement was cited as a contributing factor, and the inoperability of the crash attenuator contributed to the driver's injuries. As an advisory agency, the NTSB does not have regulatory power; however, it made several recommendations to two regulatory agencies. The NTSB recommendations to the NHTSA included: expanding the scope of the New Car Assessment Program to include testing of forward collision avoidance systems; determining whether "the ability to operate outside the intended operational design domain pose an unreasonable risk to safety"; and developing driver monitoring system performance standards. The NTSB submitted recommendations to OSHA relating to distracted driving awareness and regulation. In addition, the NTSB issued recommendations to manufacturers of portable electronic devices and to Apple.
Several NTSB recommendations previously issued to NHTSA, DOT, and Tesla were reclassified to "Open—Unacceptable Response". These included H-17-41 and H-17-42.

South Jordan, Utah (May 11, 2018)

On the evening of May 11, 2018, a 2016 Tesla Model S with Autopilot engaged crashed into the rear of a fire truck stopped at a red light in the southbound lane in South Jordan, Utah, at the intersection of SR-154 and SR-151. According to witnesses, the Tesla did not appear to brake or attempt to avoid the impact. The driver of the Tesla, who survived the impact with a broken foot, admitted she was looking at her phone before the crash. The NHTSA dispatched investigators to South Jordan. According to telemetry data recovered after the crash, the driver repeatedly took her hands off the wheel, including during the 80 seconds immediately preceding the crash, and touched the brake pedal only "fractions of a second" before the crash. The driver was cited by police for "failure to keep proper lookout". The Tesla had slowed to match a vehicle ahead of it, and after that vehicle changed lanes, accelerated again during the 3.5 seconds preceding the crash.
Tesla CEO Elon Musk criticized news coverage of the South Jordan crash, tweeting that "a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage", additionally pointing out that "[a]n impact at that speed usually results in severe injury or death", but later conceding that Autopilot "certainly needs to be better & we work to improve it every day". In September 2018, the driver of the Tesla sued the manufacturer, alleging the safety features designed to "ensure the vehicle would stop on its own in the event of an obstacle being present in the path... failed to engage as advertised." According to the driver, the Tesla failed to provide an audible or visual warning before the crash.

Delray Beach, Florida (March 1, 2019)

In the morning of March 1, 2019, a Tesla Model 3 driving southbound on US 441/SR 7 in Delray Beach, Florida struck a semi-trailer truck that was making a left-hand turn to northbound SR 7 out of a private driveway at Pero Family Farms; the Tesla underrode the trailer, and the force of the impact sheared off the greenhouse of the Model 3, resulting in the death of the Tesla driver. The driver of the Tesla had engaged Autopilot approximately 10 seconds before the collision and preliminary telemetry showed the vehicle did not detect the driver's hands on the wheel for the eight seconds immediately preceding the collision. The driver of the semi-trailer truck was not cited. Both the NHTSA and NTSB dispatched investigators to the scene.
In May 2019, the NTSB issued a preliminary report determining that neither the driver of the Tesla nor the Autopilot system executed evasive maneuvers. The circumstances of this crash were similar to the fatal underride crash of a Tesla Model S in 2016 near Williston, Florida; in its 2017 report on that earlier crash, the NTSB recommended that Autopilot be used only on limited-access roads, a recommendation Tesla did not implement.

Moscow (August 10, 2019)

On the night of August 10, 2019, a Tesla Model 3 driving in the left-hand lane of the Moscow Ring Road in Moscow, Russia, crashed into a parked tow truck whose corner protruded into the lane, and subsequently burst into flames. According to the driver, the vehicle was traveling at the speed limit with Autopilot activated; he also claimed his hands were on the wheel but admitted he was not paying attention at the time of the crash. All occupants were able to exit the vehicle before it caught fire and were transported to a hospital. Injuries included a broken leg and bruises.
The force of the collision was enough to push the tow truck forward into the central dividing wall, as recorded by a surveillance camera. Passersby also captured several videos of the fire and explosions after the accident; these videos show that the tow truck the Tesla crashed into had already been moved, suggesting the explosions of the Model 3 happened some time later.

Taiwan (June 1, 2020)

Traffic cameras captured the moment when a Tesla slammed into an overturned cargo truck in Taiwan on June 1, 2020. The driver was uninjured and told emergency responders that the car was in Autopilot mode, but did not have its Full Self-Driving Capability feature engaged. The driver reportedly told authorities that he saw the truck and manually hit the brakes too late to avoid the crash, which is apparently indicated on the video by a puff of white smoke coming from the tires.