
Human emotions make L2 systems like supervised Tesla FSD harder to pull off than L4 robotaxis

XPeng bets on advanced self-driving system. (Image source: XPeng)
One of the key competitors to Tesla's Full Self-Driving feature, XPeng, is betting on advanced vehicle autonomy to stand out, including with an AI-trained, camera-only vision solution. Human reactions, however, are getting in the way of its autonomous driving rollout plans.

XPeng, one of the self-driving pioneers among Chinese EV makers, is also the first of them to design its own AI driving chip, named Turing, and to move to a vision-only system thanks to advances in machine learning. Its driver-assist system is called XNGP, short for XPeng Next Generation Pilot, and is optimized for autonomous driving in dense city traffic.

According to XPeng's self-driving system head Candice Yuan, the company will launch a dedicated robotaxi similar to Tesla's Cybercab next year, and it will run on Level 4 vehicle autonomy comparable to unsupervised FSD. Developing the Level 4 autonomous driving system, which requires no driver input, was actually easier than developing the Level 2+ XNGP that its vehicles drive on now.

The Level 4 system is perfectly capable of making split-second decisions, she noted, just like Tesla's unsupervised FSD that its pilot Robotaxi service in Austin relies on. For L2 autonomy, however, the developers had to account for the fact that humans are emotional beings and often act illogically in ways that interfere with the self-driving software's algorithmic decisions. Yuan gave several examples of situations that the AI-powered system considers perfectly safe, but that the human driver deems close calls, prompting a takeover:

Firstly, drivers want efficiency. For example, if the ADAS does the right thing but the vehicle is slower than the driver expects, he will take over. Another example: the system might calculate that the situation is safe enough to go, but the driver thinks it is too narrow or too close, and he will take over. The third example might be that if the vehicle accelerates too fast or too slow, the driver will not feel comfortable and, again, take over. As a result, Level 2 autonomous driving for passenger vehicles is more complex in some areas than the L4 we developed at Alibaba for unmanned vehicles.

The driver cares about emotions and feelings, so we need to think as he thinks. We need to focus more on experience... If a robotaxi goes the wrong way, it will continue in that direction, then turn around and follow the new navigation. But in Level 2, if you go the wrong way, the driver will take over. So... it is forbidden to go the wrong way.
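To make the trade-off concrete, the takeover triggers Yuan lists can be thought of as comfort heuristics layered on top of an otherwise safe plan. The following minimal Python sketch is purely illustrative: the ManeuverPlan fields, thresholds, and takeover_risk function are invented for this article and are not XPeng's actual software.

```python
from dataclasses import dataclass


@dataclass
class ManeuverPlan:
    """Hypothetical summary of a planned L2 maneuver (illustrative only)."""
    planned_speed_kph: float           # speed the ADAS intends to hold
    driver_expected_speed_kph: float   # speed the driver would drive here
    lateral_gap_m: float               # clearance to the nearest obstacle
    peak_accel_ms2: float              # strongest planned acceleration/braking
    must_merge: bool = False           # does the plan require taking a gap?


def takeover_risk(plan: ManeuverPlan) -> list[str]:
    """Return the comfort rules a plan violates, mirroring Yuan's three
    examples: efficiency, perceived narrowness, and acceleration comfort.
    All thresholds are made up for this sketch, not taken from XPeng."""
    reasons = []
    # 1. Efficiency: driving noticeably slower than the driver expects.
    if plan.planned_speed_kph < 0.8 * plan.driver_expected_speed_kph:
        reasons.append("slower than the driver expects")
    # 2. Perceived narrowness: geometrically safe, but it feels too close.
    if plan.lateral_gap_m < 0.5:
        reasons.append("gap feels too narrow")
    # 3. Comfort: too harsh, or too hesitant when a gap must be taken.
    if abs(plan.peak_accel_ms2) > 2.5:
        reasons.append("acceleration or braking feels too harsh")
    elif plan.must_merge and plan.peak_accel_ms2 < 1.0:
        reasons.append("too hesitant for the merge")
    return reasons


if __name__ == "__main__":
    plan = ManeuverPlan(planned_speed_kph=30, driver_expected_speed_kph=50,
                        lateral_gap_m=0.4, peak_accel_ms2=3.0)
    print(takeover_risk(plan))  # every category trips -> takeover is likely
```

The point of the sketch is that none of these checks concerns objective safety; they model when a human is likely to lose patience or trust and grab the wheel, which is exactly the extra layer an L4 robotaxi without a supervising driver does not need.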

The Level 4 autonomous driving system that XPeng has created, on the other hand, learns from short video clips of both good and bad driving practices to determine the best course of action in any given scenario, much like Tesla's unsupervised FSD. XPeng said that recent advances in machine learning allowed it to ditch LiDAR and rely solely on the cameras its vehicles are equipped with for self-driving. Third-party chipmakers, however, couldn't provide the customization that the XNGP system needed to fuse XPeng's vehicle hardware with its homebrew ADAS software, so the automaker had to develop its own tailored AI chip.
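How a clip-based training pipeline might separate the good examples from the bad can be sketched in a few lines of Python. Everything below, including the clip format, labels, and weights, is a hypothetical illustration of learning from positive and negative driving examples, not XPeng's or Tesla's actual training code.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DrivingClip:
    """A short camera clip with a human-assigned quality label.
    'frames' would hold decoded video in a real pipeline; here it is a stub."""
    clip_id: str
    label: str                       # "good" or "bad" driving practice
    frames: list = field(default_factory=list)


def build_training_weights(clips: List[DrivingClip]) -> Dict[str, float]:
    """Toy curation step: imitate the good clips and keep the bad ones only
    as penalized negatives, so the driving policy learns what to avoid
    without copying it. The weights are invented for illustration."""
    weights: Dict[str, float] = {}
    for clip in clips:
        if clip.label == "good":
            weights[clip.clip_id] = 1.0    # behaviour to imitate
        elif clip.label == "bad":
            weights[clip.clip_id] = -0.2   # behaviour to penalize
    return weights


if __name__ == "__main__":
    demo = [DrivingClip("c1", "good"), DrivingClip("c2", "bad")]
    print(build_training_weights(demo))    # {'c1': 1.0, 'c2': -0.2}
```

Using the bad clips as penalized negatives rather than discarding them is one common curation choice, since the model can then learn what to avoid as well as what to imitate.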

The large language model that XPeng's vision-based system uses can be applied anywhere, just like Tesla's FSD, and XPeng is now waiting only for regulators to catch up before it rolls out its self-driving system in markets like Europe, where it recently started localized EV manufacturing to avoid import tariffs.



Daniel Zlatev, 2025-09-22 (Update: 2025-09-22)