Like A Cartoon: Former NASA Engineer Drives A Tesla Into A Painted Wall To Highlight Musk’s Failure

In his latest YouTube video, former NASA engineer and well-known content creator Mark Rober has unveiled the results of a thorough examination of Tesla’s camera-based driving safety systems. Here is what the experiment showed.

In the video, Rober tested the ability of Tesla’s camera-assisted systems to detect and maneuver around obstacles, pitting a Tesla against a Lexus RX outfitted with LiDAR technology and comparing how the two vehicles performed.

The findings have ignited a heated argument over the safety of Tesla’s Autopilot compared with other available technologies. Rober also faced significant backlash over suspicions that his video may have been influenced by sponsorship.

The Tests in Detail

In these trials, Rober and his associates used a child-sized mannequin as an obstacle on the road. In the first test, the mannequin moved across the street as both the Tesla and the Lexus approached at approximately 40 miles per hour. The Lexus, equipped with LiDAR technology, reacted swiftly and halted safely, preventing a crash. In contrast, the Tesla Model Y, which depends solely on cameras, maintained its speed without braking and struck the mannequin, even though it has an automatic emergency braking system installed.
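To put the 40 mph test speed in perspective, here is a rough back-of-the-envelope sketch. The reaction delay (0.5 s) and braking deceleration (8 m/s²) are my own illustrative assumptions, not figures from the video:

```python
MPH_TO_MS = 0.44704  # exact conversion factor: miles per hour -> metres per second

def stopping_distance_m(speed_mph: float,
                        reaction_s: float = 0.5,
                        decel_ms2: float = 8.0) -> float:
    """Distance covered during the reaction delay, plus braking to a stop
    at constant deceleration: d = v * t_react + v^2 / (2 * a)."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# A car at 40 mph needs roughly 29 m to stop under these assumptions,
# which is why a late (or absent) detection leads straight to impact.
print(f"{stopping_distance_m(40):.1f} m")
```

Even a fraction of a second of extra detection latency adds several metres of travel at this speed.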

In a follow-up test using Tesla’s Autopilot feature, the car did apply the brakes in time to prevent an accident. Even so, it was noticeably slower to respond than the Lexus’s LiDAR-based system. Rober also noted a known drawback of Tesla’s Autopilot: occasional phantom braking.

For the subsequent trials, Rober kept Autopilot engaged and subjected the cars to progressively demanding conditions. A particularly challenging scenario required the Tesla to identify the dummy amid dense fog and heavy rain. Under those conditions, the Tesla’s performance deteriorated significantly, while the Lexus, thanks to its LiDAR technology, detected the obstacle effectively. A simulation of sun glare, however, produced a more interesting result: as both vehicles approached the dummy with the sun casting a low light on the horizon, the Tesla recognized the obstacle and halted in time, matching the Lexus.

The final test took an amusing, cartoonish turn. Rober constructed a massive foam wall printed with an image of the road ahead, mimicking a classic movie-set prop – what some might call a “fake road barrier.” Even to the human eye, it was challenging to discern the printed road from the genuine one.

In this setup, the LiDAR-equipped Lexus smoothly stopped as its sensors accurately identified the foam wall as an obstacle and weren’t deceived by the illusion. Conversely, the Tesla, which relied solely on cameras, couldn’t distinguish between the actual road and the depicted wall. The outcome was a sensational collision: the Model Y crashed straight through the foam wall, wrecking the dummy positioned behind it.

The demonstration underscored how important it is for such a system to distinguish real obstacles from convincing illusions in order to function safely.

Criticism and Reactions

Following the release of the video, Rober received heavy backlash from some viewers, who alleged that he had edited the footage to portray Tesla negatively and to promote Luminar, a LiDAR technology firm. The frequent appearance of Luminar employees and branding in the video fueled these suspicions. Luminar itself initially featured the video on its website, but took it down as criticism intensified.

Another point of debate concerned the title of the video: “Can you trick a self-driving car?” Rober said he had activated Tesla’s Autopilot system, but it’s essential to understand that Autopilot is not a fully autonomous driving mode. It is a driver-assistance system that requires the driver to stay alert and prepared to take control.

On social media platforms, there was discussion about potential inconsistencies in Rober’s claimed use of Autopilot. Some people suggested that the system wasn’t active during the accident because the central display showed no blue lane markings or the “rainbow road” effect – visual indicators that Autopilot is activated.

In this video, you can see my Tesla driving straight into a wall. It appears that Autopilot disengaged seventeen frames before impact, even though I wasn’t pressing either the brake or the accelerator at the time.

— Mark Rober (@MarkRober) March 17, 2025

Rober responded to the allegations by sharing “raw crash footage” aimed at dispelling the speculation. In an update on X, he said he did not know why Autopilot deactivated 17 frames prior to impact, but affirmed that he hadn’t touched the brake or accelerator. Regrettably for him, the raw footage only added fuel to the fire: Tesla enthusiasts spotted an inconsistency between the clips, with Autopilot engaged at 39 mph in the initial video but at 42 mph in the freshly released footage. This discrepancy implies that the test might have been repeated on different occasions.
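For context on the “17 frames” figure, here is a small sketch converting frame counts to elapsed time. The frame rates are my assumption; the video’s actual capture rate is not stated:

```python
def frames_to_seconds(frames: int, fps: float) -> float:
    """Convert a frame count to elapsed time at a given frame rate."""
    return frames / fps

# 17 frames before impact at common video frame rates:
for fps in (24, 30, 60):
    print(f"{fps} fps: {frames_to_seconds(17, fps):.3f} s")
```

At typical frame rates, 17 frames corresponds to well under a second before impact.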

These events have fueled speculation that the video was deliberately orchestrated to tarnish Tesla’s image, given Elon Musk’s close ties to the White House through the Department of Government Efficiency, a circumstance that may cause concern among investors.

It seems clear that the Mark Rober incident requires a resolution. To achieve this, Tesla should retrieve the necessary data directly from the vehicle and present us with an accurate account of the events. This includes details such as the number of attempts, speeds at which it occurred, instances of Autopilot engagement and disengagement, and so on. I have no doubt that the data will confirm this was…

— Blackout Trades (@blackouttrades) March 17, 2025

Viewers also spotted an unusual discrepancy involving the Google Pixel smartphone used to film the in-car test footage. Although Rober appeared to be handling the device, the “G” emblem on its back stayed upright even when the phone was held sideways. This raised questions about whether an iPhone or another device was actually used, with the “G” logo digitally inserted afterward. The motive for such a manipulation is unclear, as Google was neither credited as a sponsor in the video nor mentioned in its description.

Mark Rober: “And here it is! This video will surely be a hit, as everyone seems to dislike Elon these days.”

Tesla community: *Quickly points out flaws in the experiment within 36 hours*

Mark Rober: “Ah, um, well, this is the unedited footage. I’m unsure of what happened, I swear I’m innocent!”

— Chris Robovan (@chrisrobovan) March 17, 2025

Conclusion

In his recent video, Mark Rober, famous for his scientific explorations and inventive experiments, has sparked a heated debate about the limits of Tesla’s camera-based Autopilot technology. Some viewers see his experiments as a necessary safety evaluation, while others claim he deliberately manipulated the footage to undermine Tesla and advocate for LiDAR technology instead. Whether the video was an intentional deception or a genuine exposure of a real flaw in Tesla’s system continues to be debated.

It’s clear that this video has reignited conversations about Autopilot technology and its safety issues, drawing attention to both Elon Musk and Tesla for further examination yet again.

2025-03-19 19:40