

Introduction
This will be the first episode of a two-part series on autonomous vehicles (AVs), which will likely become one of the highest-growth arenas of the coming decade. Currently we’re living in a world without much vehicle automation. However, green shoots are starting to emerge. Teslas in FSD (full self-driving) mode are gradually requiring fewer and fewer interventions, while robotaxis such as Cruise and Baidu are expanding operations in more and more cities. We could be at the start of the classic sigmoid-shaped adoption pattern typical of new technologies — slow at first, with a gradually increasing slope until high adoption rates have been reached, after which the slope flattens again until the new technology is fully adopted. Looking at history, the speed at which technological transitions take place has only been increasing. The transition from horse and carriage to the automobile was relatively slow, whereas the transition from VHS to DVD happened in a span of less than 10 years. And another 10 years later, DVDs were largely replaced by streaming services such as Netflix.
However, even 100 years ago, the transition from horse and carriage to the automobile went incredibly fast in the largest and most advanced cities. One only has to compare New York’s streets in 1903 to those of 1913. In the former, I’m seeing one car parked in the picture..
.. While ten years later, the streets are packed with cars..
Companies exposed to these types of transitions usually provide some of the best investment opportunities, as the market frequently underestimates the opportunity set — especially when the company driving the transition has a sufficient moat to protect itself from new competitors, so that it is in a position to capture the profits from this new market. The operating leverage on an only partly variable cost base can be tremendous.
Ford is an obvious example here of an innovator, enabling the mass adoption of cars with its moving assembly line. However, as new entrants started competing, its dominant market share gradually eroded over subsequent decades. Ford is only a small player on the automotive scene today. The likes of Amazon, Apple, Microsoft, Nvidia and Google are examples of disruptors who’ve been able to hold on to fairly strong market shares.
So, the aim of this two-part series will be to identify one or more Nvidias in the autonomous vehicle world ten years from now. Ideally, as smart investors, we’ll be looking for a business with only limited downside risk if the technological transition we’re anticipating doesn’t occur. We could then sell the shares later on with no or only limited losses and reorient capital towards new openings. On the flipside, we should get large upside in the shares if the technology does break through. Additionally, we’re looking for a business with a sufficiently wide moat so that new competitors won’t be able to get a foothold in the industry. In autonomous driving, this should be provided by the data advantage of the largest players, as AI training is a scale game. We have here a first solid candidate which comes to mind..
A tour of Mobileye
Mobileye is probably the perfect name for those sceptical of the development of fully autonomous driving, but interested in investing in high-growth quality tech names with a strong market share position. The company has a hands-off product similar to Tesla FSD, with eyes-off as well as full robotaxi systems also going into production in the coming years.
The hands-off product, named ‘SuperVision’, only makes use of cameras. The next products in the coming years will also utilize a second sensor suite made up of imaging radars and one lidar. The latter system will act as a second, independent decision-making body. For example, if the independent vision and radar systems agree on a particular action the vehicle has to take, there should be no or extremely little risk of making a mistake. If the two systems reach different conclusions, the vehicle can automatically start operating with more caution, giving preference to the recommended action with the lowest probability of causing an accident. The hands- and eyes-off system is called ‘Chauffeur’, while the robotaxi product is named ‘Drive’.
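The power of this set-up comes from the independence of the two systems: if each perception stack only rarely makes a critical error on its own, the probability of both failing on the same decision is roughly the product of the two. A minimal sketch of that arithmetic, with made-up failure rates purely for illustration (these are not Mobileye figures):

```python
# Illustrative only — the failure rates below are invented placeholders, not
# Mobileye figures. The point: with two genuinely independent perception
# systems, joint failures multiply, so the combined mean time between
# failures (MTBF) grows enormously.

camera_mtbf_hours = 10_000        # hypothetical: one critical miss per 10,000 hours
radar_lidar_mtbf_hours = 10_000   # hypothetical, for the independent radar/lidar stack

p_camera_fail = 1 / camera_mtbf_hours             # per-hour failure probability
p_radar_lidar_fail = 1 / radar_lidar_mtbf_hours

# A joint failure requires both stacks to miss at the same time (assuming independence).
combined_mtbf_hours = 1 / (p_camera_fail * p_radar_lidar_fail)

print(f"Combined MTBF: {combined_mtbf_hours:,.0f} hours")  # 100,000,000 hours
```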
Mobileye’s CEO updated us on current business developments during the Q2 call:
“We can now count nine large established OEM prospects in what we consider advanced stages for products like SuperVision and Chauffeur. In most cases we are not competing against anyone. Currently, this list of OEMs represents about 30% of global volume. This is very encouraging because the vast majority of the rest of the industry remains very open to us. The process is about physical testing to convince the OEM of the performance and the design domain of the system, establishing what role the OEM will have in customizing. What appeals to the OEMs is that our product portfolio is scalable, cost efficient, and, above all, displaying leading and cutting edge performance. The ability to provide efficient and high-probability products across all vehicle price points, for both consumer-owned and mobility-as-a-service (MaaS) solutions, all based on the same proven core technology, is a huge selling point to OEMs.
Our work with Volkswagen Group is a good example. Since 2018, all new vehicles across the group have used Mobileye provided ADAS (advanced driver assistance systems). We have the SuperVision design win with Porsche. Porsche shares common platforms with other premium brands of the Volkswagen Group. While not formalized yet, we expect SuperVision to be adopted by the other premium brands to increase economies of scale. In fact, Audi and Bentley executives are already on record expressing excitement to bring SuperVision to their products. An additional benefit is that it creates a bridge to Chauffeur.”
Mobileye’s product portfolio is pictured below — the neat design of the system allows an OEM simply to add more EyeQ chips and sensors to a vehicle, in order to progress to higher levels of autonomous driving capabilities. EyeQ chips process sensor information to turn this into driving decisions based on the trained AI models.
These EyeQ dies are still being upgraded. EyeQ6 is manufactured on Intel’s 7nm node and will have 8 CPU cores, one GPU for image signal processing, and four AI accelerators. These accelerators are optimized for Mobileye’s AI workloads and are therefore lower cost than general-purpose solutions as well as more power efficient, an important consideration in vehicles where range is crucial. EyeQ7 will be on 5nm, with 12 CPU cores and again one GPU and four accelerators.
The company’s IR detailed their progress here at the Goldman conference:
“EyeQ6 will be our new workhorse chip for high-volume ADAS programs, and launches in Q1 of 2024. EyeQ7 is a single die with all the processing power needed for full self-driving. That's on track as well. But we can easily put two EyeQ6 chips instead — SuperVision is two EyeQ6, Chauffeur on highway is three, and Chauffeur everywhere is four.”
The visual input from a SuperVision system:
Mobileye has a long history of working with leading automotive brands to implement driver assistance systems. From the prospectus: “As of October 2022, our solutions had been installed in approximately 800 vehicle models and our System-on-Chips (SoCs) had been deployed in over 125 million vehicles. We are actively working with more than 50 Original Equipment Manufacturers (OEMs) worldwide on the implementation of our ADAS solutions, and we announced over 40 new design wins in 2021 alone. We currently ship a variety of ADAS solutions to 13 of the 15 largest automakers in the world.” The company’s EyeQ chips are included in more than 7 out of 10 vehicles with a traditional ADAS system. Clearly they have a strong positioning with their traditional business.
Strong relationships are continuing, with the large automotive groups from around the world launching models with Mobileye tech inside:
The automotive industry has witnessed trends towards both consolidation and fragmentation. Large automotive groups in the West have been consolidating brands to gain scale and offset market share losses caused by a series of waves of new entrants — the Japanese in the eighties, followed by the Koreans in the nineties, and now we’re in the midst of a new wave with both the Chinese and pure EV manufacturers gaining share. Automotive consolidators in the West include GM, Stellantis, and Volkswagen, each of which owns a portfolio of brands — Stellantis with Chrysler, Jeep, Dodge, Ram, Fiat, Peugeot, Opel and Maserati, for example, while Volkswagen manufactures VW, Audi, Porsche, Bentley and Lamborghini vehicles among others.
These relationships between the OEMs and Mobileye are now deepening, as the momentum towards vehicle automation is gaining strength — Mobileye’s CEO: “Working in our favor is an increase in competitive pressure as Tesla and the Chinese startups push the envelope on hands-free technology. We have noted an increase in seriousness within the OEMs over the past one or two years, and have seen some OEMs that appeared to be far away from us on advanced technology move rapidly to align behind our approach. This is all very positive for us as a technology and cost leader.”
There are also OEMs aiming to build their own systems, which is obviously not without risk. Developing innovative tech is extremely hard, especially for legacy or traditional companies who don’t have the culture and often struggle to attract the best engineering talent. However, there will be a few exceptions here such as Mercedes, which seems to have developed capable tech in partnership with Nvidia.
Mobileye’s CEO commented on the trends in competition with OEMs: “We have a large number of serious engagements with OEMs that in the past were very bullish on talking only about in-house development, and we are now around the table talking with them about SuperVision products and beyond. That’s the majority of the competitive landscape. It’s not the likes of Nvidia and Qualcomm. They are offering the tools for in-house development by OEMs, so the competitors are the OEMs themselves. Once we started putting vehicles on the road with our technology where people can test, OEMs can test, now also the public can start testing, the difference is becoming visible. It’s all about cost versus performance, right? Even if they have the same performance as SuperVision, but cost four times more, then it’s not competitive. I think this is becoming visible now that things are really in production.”
SuperVision’s China rollout
Zeekr, a premium EV brand from Chinese OEM Geely, was the first to bring the SuperVision system to market last year, selling 90,000 units in 2022 alone. The software is updated over the air so that the cars are always driving on the latest AI models. Mobileye’s IR recently presented at a few conferences to update us on the China rollout:
“The ultimate proof point really happened 2 months ago when we rolled out the SuperVision to the first 1,100 beta users, Zeekr owners. So they experienced the technology for 6 to 8 weeks. Zeekr monitored, got the car into hands of influencers and media people, and reviews were really positive. And then Zeekr had enough confidence to roll it out from 1,100 people to 110,000 people about 2 weeks ago.
The ability to handle much more difficult situations, to be more assertive on the road, like if you're wanting to get off an exit and there's a line of cars, do you get in back of the line? Or do you do the Jersey move and go up and push yourself in front, and that's what the Zeekr vehicle is doing. I've had more than a few investors email me in the last couple of weeks, they went to China, they experienced the system in the Zeekr vehicles and they think it's better than Tesla.
You have systems in China from Li Auto, NIO, XPeng and maybe two others. These systems are internally developed, either using Huawei or Nvidia processors. And they're pretty good. The most common comment we get is the intervention rate of the SuperVision is much lower.
From a cost standpoint, all you really need to do is look at the sensor set on the vehicle and the amount of compute. What we're hearing from inside companies is that these systems are limited in terms of where they operate and are costing $4,000 to $6,000 bill of materials. SuperVision is essentially roughly $1,500 to $2,000 to the OEM. So we feel like we have a significant cost advantage.
We were a little late to enter China, we really started in 2014, where the rest of the world was more like 2007. Bosch had most of that market at that point in radar systems. At the time that we were acquired by Intel, we had maybe 25-30% market share in China. Now we have over 60% market share. So we've done really well on the single front-facing camera ADAS business. And it's become a big chunk of our business, 25% of revenues.”
The reason other AV providers can be limited in terms of the areas where they can operate is that, when relying on HD mapping, they need to have a particular area mapped before the AVs can drive through it. If you have to map an area with a fleet of scanning vehicles paid for by the provider, this can easily run into the tens of millions of dollars for a city. Additionally, you need to keep the map updated, so you need to have sufficient vehicles on the road at all times. As Mobileye already has over a hundred million vehicles deployed globally with its EyeQ chip, continuously updated HD maps are already being collected at low cost.
How assertively the AV drives can be tweaked by the OEM. Mobileye provides an AI model which is able to drive in a variety of styles, which the OEM can then tune to its liking.
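As a purely hypothetical sketch of what such OEM-side tuning could look like — none of these parameter names come from Mobileye’s actual interface — think of a fixed driving model exposed through a handful of behavioral knobs:

```python
# Hypothetical illustration only — the parameters and values are invented and
# do not reflect Mobileye's actual API. The underlying AI driving model stays
# the same; the OEM just ships its preferred calibration of a few knobs.

from dataclasses import dataclass

@dataclass
class DrivingStyleConfig:
    assertiveness: float = 0.5      # 0 = very defensive, 1 = pushes into gaps
    min_merge_gap_s: float = 1.5    # smallest time gap accepted when merging, seconds
    max_lateral_accel: float = 2.5  # comfort limit in curves/lane changes, m/s^2

# A sporty brand might ship a more assertive calibration than a family SUV:
sporty = DrivingStyleConfig(assertiveness=0.8, min_merge_gap_s=1.0, max_lateral_accel=3.0)
print(sporty)
```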
The outside of a Zeekr:
And the inside with Mobileye’s navigate on pilot (NOP):
Mobileye’s CEO provided some further details on the Q2 call:
“Zeekr’s system is performing much better than other NOP (navigate-on-pilot) systems in terms of ability to complete maneuvers without takeover in many difficult situations like construction areas, highway merges in heavy traffic, and performing lane changes within tight curves. Influencers and media have also highlighted the strength of the system versus competitors, focusing on the assertive human-like performance of the car, several calling it the most efficient and capable navigate-on-pilot they ever experienced.
Any negative feedback has been around some dead spots in the map, which will be rapidly built out over the following months. Mapping is key to this. The complexities of mapping in China mean that data collection must be done through Chinese partners and as a result the data collection path is much later in China than North America and Europe.
The eyes-on hands-free market is much more developed in China than other regions, and it’s a significant proof point to other OEM customers that Zeekr’s system is outperforming. You have XPeng, Li Auto, Nio — they have product on the road. They have many more sensors than SuperVision. All of them have front-facing lidars. They have much more compute, sometimes somewhere between 10 to 20 times more compute than we have. Very expensive products. When we start doing benchmarking, we are superior in terms of performance in almost every aspect. This supports the feedback we have gotten from other OEMs that have performed benchmark tests of their own.”
Having a strong position with the Chinese OEMs should be attractive over the coming decade as these companies are taking share in the automotive market, both in China as well as overseas. China has now even surpassed Japan and Germany to become the number one car exporter.
Geely, with which Mobileye seems to have a good relationship, is a reasonably large Chinese automotive group and was the number three Chinese exporter in 2022. The company made $22 billion in revenues over the last twelve months.
Geely owns a large variety of automotive brands, of which Volvo and Smart are probably the most well-known in the West. Zeekr is the fifth logo below, the one between Smart and Lotus. The brand is being promoted as pure-EV, high-tech automobiles. Starting prices for the various Zeekr models range from $45,000 to $75,000 — so the product is clearly meant to compete with Tesla.
Although Mobileye is also selling ADAS systems to BYD, Chery, SAIC and Great Wall Motors, several of these have been teaming up with various names such as Horizon Robotics, Nvidia and Qualcomm to develop self-driving systems. Clearly the landscape is fairly competitive.
There could be a risk of a Chinese export ban, although I suspect this is low at this stage. Even Nvidia is still allowed to sell H800s into China, a slowed-down version of its flagship H100 GPU that remains far more powerful and versatile than a Mobileye EyeQ. The H800 can be used in a ton of applications, whereas an EyeQ is specifically designed for Mobileye’s workloads. In the worst case, if a Chinese ban were to occur, it would currently impact around 25% of Mobileye’s revenues. It’d be a blow but it wouldn’t be lethal. Long term, investors could still do well in this name if the company executes well.
Mapping, a competitive advantage
The mapping is an important input for Mobileye’s AI, next to the camera and radar systems. The way it works is that the map continuously provides the vehicle with context on how traffic normally flows in its surroundings. So when the vehicle arrives at a complex intersection, the AI has a lot less to figure out. It already knows the path it’s going to follow, so it only needs to check whether other objects might get in the way. Only if there are temporary construction works, for example, would the system have to come up with a different path, such as driving around the construction site. Another example is when the car comes up to a complex set of traffic lights: the map will provide context on which light belongs to which lane and which lights are for pedestrians.
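A conceptual sketch of how such a map prior can shrink the planning problem — the data structures below are invented for illustration and are not Mobileye’s actual REM format: the map supplies a precomputed reference path plus semantics (for example, which light governs which lane), so the online system mainly validates that path against live perception instead of solving the intersection from scratch.

```python
# Conceptual sketch only — invented data structures, not Mobileye's REM format.
# The map prior supplies a reference path and semantic context; the online
# planner follows it unless live perception contradicts it.

from dataclasses import dataclass

@dataclass
class MapPrior:
    reference_path: list      # precomputed geometry through the intersection
    governing_light_id: str   # which traffic light applies to this lane

def obstacle_on_path(obstacle, path) -> bool:
    return obstacle in path   # placeholder geometric check for this sketch

def replan_around(obstacles, path) -> list:
    return [p for p in path if p not in obstacles]  # placeholder detour logic

def plan(prior: MapPrior, detected_obstacles: list, light_states: dict) -> list:
    """Reuse the mapped path unless the live scene says otherwise."""
    if light_states.get(prior.governing_light_id) != "green":
        return []  # our light is not green: hold position
    if any(obstacle_on_path(o, prior.reference_path) for o in detected_obstacles):
        return replan_around(detected_obstacles, prior.reference_path)  # e.g. construction
    return prior.reference_path  # default: the prior already did most of the work

# Example: a mapped left turn with an obstacle on one waypoint.
prior = MapPrior(reference_path=["A", "B", "C"], governing_light_id="L3")
print(plan(prior, detected_obstacles=["B"], light_states={"L3": "green"}))  # ['A', 'C']
```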
Overall this sounds like a logical system. I’ve always been somewhat sceptical of Tesla’s pure vision-based methodology, but we’ll see in the coming years who’s right. Clearly Tesla has a great FSD system, but nothing close to full autonomy where we can remove the steering wheel from the car and transform it into a robotaxi. With Mobileye’s EyeQ chips installed on over 100 million vehicles, the company should have quite an advantage in continuously updating its maps. So if a certain junction is blocked by an accident, the software can already signal vehicles down the road to take another route. That said, if Mobileye or other manufacturers start developing clearly superior products, Tesla can still pivot towards including context-based maps as a factor. Obviously, they have a strong collection of data as well and thus should be in a good position to remain a strong player in AVs.
Next level, Chauffeur and Drive
The first next-gen product, Chauffeur, is expected to be released on the roads in two years. This is the one that will combine two separate perception and decision making systems, which are then brought together into a final decision making unit. Once again Geely Group will be the first to adopt, this time in their Polestar brand, with Volkswagen to follow.
Mobileye’s CEO on the Q2 call: “The secondary perception system made up of radar and lidar results in a significant increase in the mean time between failure which is obviously key to enabling eyes-off. In other words, full driver disengagement under a broad set of conditions and road types. This also forms the baseline for our Mobileye Drive mobility-as-a-service (MaaS) solution. On this front there has been recent news on our delivery of multiple self-driving systems which have been integrated into Volkswagen’s ID Buzz for testing in both the US and Europe.
The fact that Volkswagen has recently demonstrated these vehicles with analysts and media after only several months of us working together is a testament to how evolved this technology already is. Customers are telling us that we can save them EUR 100,000 a year by eliminating the driver, and the system will cost much less than that. We expect these Mobileye Drive based vehicle platforms to begin serial production in 2025, which also coincides with volume production of our EyeQ6 based compute platform and our software defined imaging radar, each important for scaling the mobility-as-a-service business.
So the focus is on collaborating with or partnering with platform builders rather than having our own vehicle and customer-facing applications.”
Subsequently Mobileye’s CFO detailed their engagements on the Drive system: “We have three engagements to supply vehicles with our self-driving system by the middle of 2025, including Schaeffler, Holon from the Benteler Group, and Volkswagen.”
Schaeffler is a large tier-one automotive supplier with EUR 16 billion of revenues last year. Holon seems to be a new project from Benteler, a German industrial group with EUR 9 billion of revenues, aimed at providing autonomous shuttle buses and robotaxis for e-commerce deliveries. The shuttle bus product will be launched in 2025 in the US. Of these three, Volkswagen is obviously the most interesting engagement so far. Hopefully more will be announced in the coming twelve months.
Mobileye is also allowing customers to develop their own apps to be run on top of their platform alongside the Mobileye apps:
Mobileye vs Tesla
Mobileye has generally been very complimentary of Tesla; however, in a recent blog post, the company’s CTO and founder-CEO made some interesting criticisms of Tesla’s new end-to-end approach to self-driving. Overall, as previously discussed, Tesla has some of the best capabilities in hardcore engineering and they amazed again with their recent Optimus showing. However, on the topic of full self-driving I’m currently leaning towards Mobileye’s methodology with the addition of HD maps.
From Mobileye’s blog: “Recently, Tesla has indicated they will adopt this approach for end-to-end solving of the self-driving problem. The premise is to switch from a well-engineered system comprised of data-driven components interconnected by many lines of code to a pure data-driven approach comprised of a single end-to-end neural network.
For transparency, while it may be possible to steer an end-to-end system towards satisfying some regulatory rules, it is hard to see how to give regulators the option to dictate the exact behavior of the system in all situations. In fact, the most recent trend in LLMs is to combine them with symbolic reasoning elements – also known as good, old fashioned coding.
For controllability, end-to-end approaches are an engineering nightmare. Evidence shows that the performance of GPT-4 over time deteriorates as a result of attempts to keep improving the system. This can be attributed to phenomena like catastrophic forgetting and other artifacts of RLHF (reinforcement learning from human feedback). Moreover, there is no way to guarantee ‘no lapse of judgement’ for a fully neuronal system.
Regarding performance — i.e. the high mean time between failure (MTBF) requirement — while it may be possible that with massive amounts of data and compute an end-to-end approach will converge to a sufficiently high MTBF, the current evidence does not look promising. Even the most advanced LLMs make embarrassing mistakes quite often. Will we trust them for making safety critical decisions? It is well known to machine learning experts that the most difficult problem of statistical methods is the long tail. The end-to-end approach might look very promising to reach a mildly large MTBF (say, of a few hours), but this is orders of magnitude smaller than the requirement for safe deployment of a self-driving vehicle, and each increase of the MTBF by one order of magnitude becomes harder and harder. It is not surprising that the recent live demonstration of Tesla’s latest FSD by Elon Musk shows an MTBF of roughly one hour.”
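To put that MTBF gap into rough numbers — the safety target below is an assumed placeholder for illustration, not a figure from the blog post or from any regulator — going from roughly one hour between interventions to, say, a million hours means closing six orders of magnitude, each of which the authors argue gets progressively harder:

```python
import math

# Illustrative only: the one-hour starting point is the rough figure cited in
# the quote above; the one-million-hour target is an assumed placeholder for
# "safe driverless deployment", not a number from Mobileye, Tesla, or a regulator.

current_mtbf_hours = 1
assumed_target_mtbf_hours = 1_000_000

gap_in_orders_of_magnitude = math.log10(assumed_target_mtbf_hours / current_mtbf_hours)
print(f"Orders of magnitude still to close: {gap_in_orders_of_magnitude:.0f}")  # 6
```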
Tesla, as usual, is bullish on the development of FSD. Some highlights from Elon Musk on the last call:
“Today, over 300 million miles have been driven using FSD Beta. That 300 million mile number is going to seem small very quickly. It will soon be billions of miles, tens of billions of miles. And the FSD will go from being as good as a human to then being vastly better than a human. I think we'll be better than human by the end of this year. That’s not to say we’re approved by regulators.
In a neural net at a million training examples, it barely works. At 2 million, it slightly works. At 3 million, okay we're seeing something. But then you get to 10 million training examples, it becomes incredible. Our Dojo training computer is designed to significantly reduce the cost of neural net training, it’s optimized for video training. Tesla has more vehicles on the road that are collecting this data than all other companies combined. I think we might have 90% of all data being gathered.
And our future robotaxi products, we think have a quasi-infinite demand. The way we’re going to manufacture the robotaxi is also itself a revolution. But we are very open to licensing our FSD software and hardware to other car companies. And we are already in early discussions with a major OEM.”
I have no doubt that Tesla will be top notch in the manufacturing of their next-gen vehicle. The company is probably the best automotive manufacturer on the planet right now with its innovations in gigacasting and vertical integration across a wide variety of technologies. This has resulted in Tesla being the only one able to print high margins in EVs, whereas competitors are largely burning cash.
On the Q2 call Mobileye’s CEO discussed new developments regarding their competition with Tesla:
“I think that Tesla has mentioned several times in the past about licensing their FSD, so it’s not really a new concept. I would say that we have lots of respect for what Tesla has accomplished with FSD. In fact, we see their rapid development as a significant positive for us as it pushes the market to move faster to implement advanced solutions like SuperVision. If you look at SuperVision, it’s an FSD-like category: 11 cameras and a few radars. SuperVision also has REM, the high-definition mapping, in addition to what FSD can offer. Today we have 120,000 SuperVision enabled vehicles in China, and the response in terms of comparative analysis is very good. It’s on par or superior to FSD. That’s measured by the rate of intervention and ability to handle complex maneuvers. REM is a strong differentiation.
But now, let’s look at the cost. The price of a SuperVision system including the cameras, radars, and the software with REM is approximately somewhere in the $2,500 range. Now, if Tesla matches that price then OEMs will be able to offer SuperVision or FSD at less than half the price that FSD is offered to Tesla car owners. This would immediately cannibalize Tesla.
I would also mention — and this bodes well with our OEM customers — there are 400,000 FSDs on the road since 2019 and Mobileye has already 120,000 — and in approximately two years we’ll surpass the one million bar and from there we’ll grow much faster. There are also important differences with respect to access of data, something that Tesla has often highlighted as an advantage. For example, at their March investor day, Tesla noted they had a video cache of 30 petabytes, and were intending to grow to 200 petabytes. Our video database is 400 petabytes. Not to mention all the data that we collect for REM, the high-definition mapping. We collected almost 9 billion miles of this type of data in 2022 alone. Tesla talks about 300 million miles driven to date. We believe that SuperVision is a much more optimal solution for our customers, both in terms of cost, performance, and customization basis.”
In all likelihood, access to data will be crucial to winning in AVs. When one of Mobileye’s vehicles makes a mistake, they analyse the event to see if they can find similar edge cases and retrain the system on a variety of these scenarios. Tesla has similar capabilities. However, lack of data will be a serious barrier for new players looking to enter this field, resulting in what should become a consolidated market in the medium term with probably only a handful of strong players, or fewer.
Mobileye’s founder, the Israeli Elon Musk
Mobileye’s CEO and founder, Amnon Shashua, is the Elon Musk of Israel. He founded CogniTens, a maker of 3D scanning systems, in 1995; it is now part of Hexagon, a Swedish conglomerate in industrial tech. Mobileye was subsequently founded in 1999, to implement his research on detecting objects via cameras and software algorithms. He then founded OrCam in 2010, which provides portable devices that describe surroundings such as text and objects to visually impaired people. In 2017, he co-founded AI21 Labs, which builds AI co-pilots that help you write and understand text. Later, in 2019, he co-founded Mentee Robotics, which is aiming to build humanoid robots, like Tesla’s Optimus.
Shashua still serves as chairman of these firms. In between these projects he also teaches at the Hebrew University of Jerusalem. Overall he has published over 160 papers on machine learning and vision systems.
Mercedes, an unexpected number one?
A number of OEMs are developing their own self-driving systems, often with the help of partners such as Nvidia and Qualcomm. Mobileye thought XPeng had advanced the furthest in China; however, they are now moving away from an HD-map-based approach to vision only, similar to Tesla, because the mapping was too costly for them. In the West, Germany’s Mercedes is the first to receive regulatory approval to market an eyes-off system. From Mercedes’ tech day:
“In Germany, our Level 3 Drive Pilot has been on the market since May of last year. This year, Drive Pilot received certification from the state of Nevada. And very soon, we expect California to certify it too. This generation enables conditionally automated driving at up to 60 kilometers per hour in Germany or 40 miles per hour in the US where allowed. Next year, we aim to increase speeds to approximately 90 kilometers per hour on the German autobahn. And to achieve this, our sensing technology is based on multiple systems, including radar, cameras, lidar and more. Redundancy of major components is an important factor here.
This will be augmented by a crowd-sourced HD map to handle the most complex situations. Our vehicles are able to collect up to 300 petabytes of data every year to further train our deep neural networks. And our AI automatically chooses a fraction of this data that is the most useful to improving our software stack. This will be able to deal with the diversity of different geographies from Europe to China, US and beyond. Our ultimate goal is 130 kilometers per hour, around 80 miles per hour. Functionality will include automatic lane change and highway to highway transfer, we intend to roll out these features worldwide. And what you can do in the car once you are in Level 3, it’s about content and entertainment. We are also at the forefront of automated parking. Our intelligent park pilot is the world’s first Level 4 parking system.
Within Nvidia, we have access to software development, AI, and processing skills, that will take us to the next level. We equip the vehicles with our Level 2 features and carry the respective hardware costs, including favorable SoC pricings from Nvidia. Due to the high attractiveness, we assume a high take rate for these features. In the end, we, as Mercedes, generate 100% net sales and share roughly half of it with Nvidia depending on the region. Even by doing so, we still expect a decent contribution margin from this business. However, L3 features are only available as an option.
The R&D spending for our software and corresponding hardware is at a run rate of roughly EUR 1 billion to 2 billion a year. At the same time, we reduced the investment in ICE (internal combustion engine) as any new platform development will be BEV (battery electric vehicle) only.”
So Nvidia is helping Mercedes in the training of AI models, as well as providing the hardware at a subsidized rate. And in return, Nvidia takes a 50% revenue share for this joint investment. Nvidia co-presented at Mercedes’ last tech day:
“We are using Omniverse to generate all sorts of new scenarios that you could encounter in the world, everything from differences in weather conditions, locations, and people’s behavior. We can simulate all of that end-to-end synthetically. And we can also simulate the sorts of scenarios that you never want to see in real life, dangerous things, people sitting on the road and so forth, and we can test against them.”
The Verge on Mercedes’ Drive Pilot: “According to The Drive, which got to test out the system on a closed course in Germany last year, the driver must keep their face visible to the vehicle’s in-car cameras at all times but can also turn their head to talk to a passenger or play a game on the vehicle’s infotainment screen. But when The Drive reporter brought a camera up to his face to take a picture, the system disengaged. In other words, the system doesn’t allow drivers to take a nap or ride in the vehicle in the backseat. In the past, people have abused the lax driver monitoring controls in Tesla’s Autopilot to do both. Other than that, Drive Pilot acts similarly to many of the Level 2 systems that are available in the US. It accelerates and decelerates, depending on traffic ahead. It can stay centered in the lane and perform automated lane changes and blind spot detection.”
Size of the AV market & Mobileye’s possible market cap
Mobileye’s ASPs will go up dramatically, as SuperVision will be priced around $1,500 per system compared to $50 for a legacy ADAS. And ASPs will only go up further thereafter with Chauffeur and Drive. As a result, Mobileye’s serviceable addressable market (SAM) is expected to grow at an annual rate of 65%, reaching $455 billion in 2030:
Now, whether this SAM is achievable will depend on whether robotaxis and software subscriptions can be introduced. A robotaxi market would be massive, especially on the consumer side in terms of ride hailing revenues, which I’ll discuss in my next note. If you purely make assumptions on AV systems being implemented into traditional consumer cars, you can probably get to a market size of around $50 to perhaps $100 billion, but obviously $455 billion would be out of reach:
If we assume an AV subscription model for consumers, you can get to a higher number:
In a robotaxi world, the likely business model would be a take rate on the revenues generated per mile. If the overall robotaxi market were to take a share of around 15% of global miles driven, selling rides at a price of 90 cents per mile, and with the AV system provider taking 20%, this would give a market size of $972 billion:
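A quick sanity check of that arithmetic. The one input not stated above is total global vehicle miles, so the sketch below back-solves it from the $972 billion outcome — treat that implied mileage as an assumption, not a sourced figure:

```python
# Back-of-the-envelope robotaxi market sizing using the article's assumptions:
# 15% of global miles shift to robotaxis at $0.90 per mile, with the AV system
# provider taking a 20% cut. The global-miles input is back-solved from the
# article's $972B figure and is therefore an assumption, not a sourced number.
# (If the $972B instead refers to total ride revenue before the take, the
# implied mileage would be one-fifth of this.)

ROBOTAXI_SHARE = 0.15       # share of global vehicle miles driven by robotaxis
PRICE_PER_MILE = 0.90       # ride price, USD per mile
PROVIDER_TAKE_RATE = 0.20   # AV system provider's cut of ride revenue

TARGET_MARKET_USD = 972e9   # the stated market size

implied_global_miles = TARGET_MARKET_USD / (ROBOTAXI_SHARE * PRICE_PER_MILE * PROVIDER_TAKE_RATE)
provider_revenue = implied_global_miles * ROBOTAXI_SHARE * PRICE_PER_MILE * PROVIDER_TAKE_RATE

print(f"Implied global annual miles: {implied_global_miles / 1e12:.0f} trillion")  # 36 trillion
print(f"AV provider revenue pool: ${provider_revenue / 1e9:.0f} billion")          # $972 billion
```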
Obviously the opportunity in autonomous driving is large. Let’s assume a possible market size of around $50 to $200 billion for the moment. Under these possible scenarios, and if Mobileye continues to hold a decent market share combined with healthy EBIT margins, the rewards will be there for investors:
Robotaxis can become a multi-trillion dollar market on the consumer ride hailing side, and if Mobileye can play a crucial part in the value chain, it could become a multi-trillion dollar company in terms of valuation. Currently the company is valued at $33 billion. This is somewhat similar to the set-up in Nvidia in 2016: we knew a potentially large opportunity in AI was there, but it was guesswork how large it was going to be in 5 to 10 years’ time.
Financials — ticker MBLY on the NASDAQ, share price of $42
In the near term, Mobileye is bullish on its SuperVision rollout, expecting volumes to grow at a 100% CAGR. Selling 1.2 million systems in 2026 would mean an adoption rate of only around 1.7% of the annual car market, so there should be room for upside here. This would give around $1.8 billion of SuperVision revenues in 2026.
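For reference, the arithmetic behind those 2026 figures, with the global car market size as the one assumed input (back-solved from the 1.7% adoption figure above):

```python
# Quick check of the 2026 SuperVision math above. The ~70M global car market and
# the $1,500 ASP are assumptions consistent with the article's own numbers
# (the 1.7% adoption figure and the ~$1,500 SuperVision pricing mentioned earlier).

SUPERVISION_UNITS_2026 = 1.2e6      # systems assumed sold in 2026
GLOBAL_ANNUAL_CAR_SALES = 70e6      # assumed global light-vehicle market
SUPERVISION_ASP_USD = 1_500         # assumed revenue per system

adoption_rate = SUPERVISION_UNITS_2026 / GLOBAL_ANNUAL_CAR_SALES
revenue_usd = SUPERVISION_UNITS_2026 * SUPERVISION_ASP_USD

print(f"Implied adoption rate: {adoption_rate:.1%}")              # ≈ 1.7%
print(f"Implied SuperVision revenue: ${revenue_usd / 1e9:.1f}B")  # ≈ $1.8B
```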
The company is in a healthy financial position with $1.1 billion of net cash on the balance sheet while generating healthy free cash flow (FCF) margins. Share-based compensation (SBC) should become less of a factor over time, and as FCF continues to grow, it can be used both to offset the dilution coming from SBC and to return capital to shareholders in the form of buybacks. Wall Street’s numbers below:
Mobileye’s multiples are fairly similar to Tesla’s, although with much higher SBC expenditure. Obviously Mobileye is an extremely R&D-intensive business, with a current R&D-to-sales ratio of 45%. As Mobileye is a pure-play on AVs, I suspect its growth rate over the coming decade will be higher than Tesla’s, as the latter is largely exposed to EVs, where adoption rates are already at around 15%. That is still a good growth story, but advanced AV systems are taking off from scratch. Additionally, AV pricing will continue to increase. So Mobileye should get the more attractive combination of volume and pricing growth. With the launch of more EV models by more manufacturers, there is also the risk of market share losses for Tesla — although Mobileye is facing a certain amount of competition as well.
Mobileye has been trading around 55x since IPOing:
The company is based in Jerusalem, so the cost base is largely denominated in Israeli shekels while revenues are made in USD, meaning currency movements can influence margins. The new campus in Jerusalem’s tech park is now nearly finished:
Intel has nearly all the voting power in the company, controlling 90% of outstanding shares. These Intel shares are also B shares, carrying 10 votes per share, whereas the listed A shares carry only 1 vote. B shares can be converted into listed A shares at Intel’s choosing. This low free float for Mobileye, around 10% of outstanding shares, means only around $3 billion in valuation is up for grabs by outside investors, making the company on the small side for the large institutional players — liquidity is fairly low. As Intel lists more shares over time, the company should become more accessible to a broader investor base, which should help the valuation.
In the next article, we’ll be looking at the robotaxi ride hailing players, where the total market size will be extremely large. We haven’t really succeeded yet in finding a company with only limited downside, but maybe we’ll be luckier next time. Nevertheless, Mobileye’s risk-reward should be quite attractive, although the company will clearly have to execute well in rolling out the SuperVision, Chauffeur, and/or Drive platforms to produce substantial rewards for investors.
Hit subscribe if you don’t want to miss the next chapter. Also, hit the like button and share a link to this article on social media or with others with a positive comment, it will help the publication to grow.
I’m also regularly discussing tech and investments on my Twitter.
Disclaimer - This article is not a recommendation to buy or sell the mentioned securities, it is purely for informational purposes. While I’ve aimed to use accurate and reliable information in writing this, it can not be guaranteed that all information used is of this nature. The views expressed in this article may change over time without giving notice. The future performance of the mentioned securities remains uncertain, with both upside as well as downside scenarios possible. Before investing, I recommend speaking to a financial advisor who can take into account your personal risk profile.