- Seems like another article based on the assumption that Nvidia just sits there doing nothing while everyone who has so far proven unable to compete suddenly figures it out and steals their lunch.
At some point one of these Nvidia doomers will be right, but there is a long line of them who have failed miserably.
- > everyone who has so far proven unable to compete
The article explains that Nvidia's biggest customers (50% of datacenter revenue) are switching to their own hardware.
- I'm skeptical of this. They have been doing this for a decade already. So far, those same companies have just become bigger and bigger customers of NVIDIA.
NVIDIA is very strategic about building product to avoid commodification: both by building out network effects where software is tied to their proprietary SDK libraries, and by always focusing on being at the cutting edge of product.
Both these things can be true: a large company should try to build their own hardware to reduce supplier risk, and a large company should be open to suppliers that have better product that delivers business value.
So far, these large companies' internal hardware has been useful internally but never a complete replacement for NVIDIA, which keeps staying at the cutting edge of new capabilities.
NVIDIA already faced existential risk when Intel was commodifying all the dedicated motherboard components (sound cards, etc.) in the late '90s and 2000s, so they're hyper-aware of this.
- > So far, these large companies' internal hardware has been useful internally but never a complete replacement for NVIDIA
Google uses their own hardware for AI/HPC. Nvidia hardware is offered on Google Cloud Platform to external customers who demand it.
The AI datacenter explosion has only occurred within the past two years. We are talking about plans that will take years to implement. The hyperscalers are trying to cut out Nvidia as soon as possible.
- I'm baffled seeing this. I'm absolutely convinced of Google's value prop; knowing that Gemini 2.5 Pro was trained, and inference is performed, on their TPUs is enormous. It's far and away the best model (Claude still has my heart, but their usage limits and limited context can't compete).
The company that can enable collaboration up the value chain the way only Google can in this space right now is going to win. The author went over similar pushes by Meta, AWS, and Microsoft.
Now that their eyes are on the prize, and the prize is so staggeringly big, I'm convinced of the threat to their moat.
- I agree. In the exact same way that hyperscalers now build some of their own network stuff, they still buy an absolute boat load from traditional (specialist) vendors.
- Google's new QAT-trained Gemma models come from Google using their own TPUs.
I don't know much about the rest, though.
- But then they're not boosting NVidia's competitors either, so wouldn't NVidia stay in the top position in their market ?
The article seems focused more on stock price and where to bet, than the market for GPUs or generic hardware vendors.
- > But then they're not boosting NVidia's competitors either, so wouldn't NVidia stay in the top position in their market ?
Nope; if hypothetically 100% left NVidia, whether to their own hardware or to not use GPUs at all, it'd be easy to say NVidia would be last in the market
- Are you arguing that the major cloud providers getting away from consumer GPUs would put AMD or Intel ahead of Nvidia in the GPU market ? How does that happen when the next biggest market (gaming) is still fully Nvidia's turf?
- > Are you arguing that the major cloud providers getting away from consumer GPUs would put AMD or Intel ahead of Nvidia in the GPU market ?
Just the hypothetical "if X% of your customers leave, but don't go to competitors, won't you keep your relative market position"
- He's talking about datacenter GPUs which are not really GPUs.
Gaming is not entirely Nvidia's turf either. AMD has 75% of the console market (Xbox and Playstation) and 30% of the PC market.
- > AMD has 75% of the console market (Xbox and Playstation) and 30% of the PC market.
Looking at these numbers[0] for the console market:
> Sony has sold 61.94M units of PS5 and Microsoft has sold 30.14M units of Xbox Series X|S and Nintendo Switch sold 143.49M units.
Nintendo (NVidia chip) sold 1.5x as many units as Sony and Microsoft combined. Given the Switch's success, the numbers look reliable to me. The Switch 2 is of course also Nvidia, and I wouldn't bet against it selling well.
[0] https://hookedontech.com/switch-vs-ps5-vs-series-x-console-g...
- Old numbers. PS5 has sold 75 million
https://www.ign.com/articles/ps5-has-best-holiday-ever-overa...
In terms of revenue, AMD has the vast majority of the console market.
Yes, the Switch sold more units, but it uses a smaller SoC with much lower revenue per unit.
Revenue and profit is what matters, not the number of units sold.
Check the price of the PS5 Pro vs Switch.
AMD has dominated the console market since Xbox One / PS4.
Nvidia could be earning $30 - $50 per Switch, while AMD earns $120 - $200 per PS5/PS5 Pro.
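For a concrete sense of what that means in revenue terms, here is a rough back-of-the-envelope sketch using the unit counts and per-chip figures quoted in this thread (the per-unit numbers are estimates, not disclosed figures):

```python
# Rough sketch only; unit counts come from the posts above, and per-chip
# revenue uses the midpoints of the estimated ranges quoted there.
switch_units = 143.49e6       # Nintendo Switch units sold
ps5_units = 75e6              # PS5 units sold (updated figure)
xbox_units = 30.14e6          # Xbox Series X|S units sold

nvidia_per_switch = 40        # midpoint of the $30-50 estimate
amd_per_console = 160         # midpoint of the $120-200 estimate

nvidia_console_revenue = switch_units * nvidia_per_switch
amd_console_revenue = (ps5_units + xbox_units) * amd_per_console

print(f"Nvidia (Switch):  ~${nvidia_console_revenue / 1e9:.1f}B")  # ~$5.7B
print(f"AMD (PS5 + Xbox): ~${amd_console_revenue / 1e9:.1f}B")     # ~$16.8B
```

Even with the Switch outselling the other two consoles combined, AMD's console chip revenue comes out roughly 3x higher under these assumptions.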
- By switching to their own hardware, they become NVidia's competitors.
- That's kind of like arguing that people cooking at home are restaurants' competitors.
There's a kernel of truth in it, but if I were McDonald's I'd care a lot more about what KFC is doing than about the market trends of cast iron pans.
- I'd say so, yes. If I'm McDonald's and both KFC and I are down, but Ralph's is up, I'd suspect a worrying trend in my industry.
It's kind of like how people make fun of YouTube for seeing TikTok as a competitor. But you dig deeper and realize why they decided to get into short-form content.
- Bad analogy
90% of Nvidia's revenue is from datacenters.
If the datacenters stop buying Nvidia's products, and use their own hardware instead, then Nvidia loses 90% of its revenue.
- > Nvidia loses 90% of its revenue.
That 90% will be a flash in the pan, the same way COVID revenue was for mask sellers. Sure, it feels bad from Nvidia's perspective, but we can also understand that the AI boom would not have kept Nvidia skyrocketing indefinitely anyway.
- No, you wouldn't, not if that market trend causes your revenue to plummet. Changes in the size of the market matter just as much as market share.
- The 3, multi-trillion dollar hyperscalers using their own GPUs are individuals cooking at home? And Nvidia is the “restaurants”?
- I'd say yes. We see the same dynamic with megacorps building in-house kitchens and paying staff to feed their employees, who then stop going to restaurants at lunch.
It's a huge shift, but not something Nvidia can act on, nor something they had to care about 5 years ago, nor a primary concern 5 years from now. On the long scale, it's almost as if the big fireworks that pushed Nvidia to its current valuation just disappear and they're back to primarily selling GPUs to OEMs and consumers.
- And something like 90% of Nvidia's revenue at this point is from the datacenter market.
- As a gamer, I selfishly wish that Nvidia would go back to being a gaming company instead of a crypto and AI compute company. I miss MSRP graphics cards :(
- Gentle reminder that a lot of mid- to lower-end graphics cards are far from MSRP. And arguably it is crypto and AI compute on servers that partly subsidized some of the R&D cost of consumer GPUs.
- Seems like the opposite to me: https://www.datacenterdynamics.com/en/news/google-in-talks-t...
- You linked to an article about Google renting an insignificant amount of additional capacity.
Google runs AI / HPC workloads on their own hardware and has been doing that for more than a decade. Google Gemini was trained on TPUs developed in house. It does not run on Nvidia hardware.
- And I believe Apple refuses to use NVidia as well; they're actually using Google's TPUs.
https://www.tomshardware.com/tech-industry/artificial-intell...
- That was for training. For inference, they reportedly use their own silicon.
- There was a rumor a few weeks ago that they broke down and placed an order with Nvidia:
https://finance.yahoo.com/news/apple-might-ai-game-1-1951003...
Before that, in a Wired article from 10 years ago about Siri and AI, one of the Apple higher-ups was quoted bragging about having one of the baddest GPU farms around (paraphrasing).
- Apple is definitely using Nvidia hardware.
- Google, last month: “we’re doubling down on our partnership with NVIDIA”
- That's Google Cloud Platform. Of course they will provide Nvidia hardware as demanded by external customers.
But their internal workloads and their frontier model (Gemini) runs on TPUs.
- Yes, but the cloud customers who "finance TPUs" have no interest in TPUs; they want Nvidia GPUs instead.
How does Google pay for TPUs internally? With Google Search and Google Cloud, of course. Google Search uses TPUs; Google Cloud, however, has far more non-TPU instances.
What people forget is that nobody wants to switch from a CUDA dependency to a software dependency on Google/AWS/Azure. CUDA at least lets me use it on consumer hardware, on pro hardware, in the cloud, AND in an on-prem data center.
I'm really looking forward to Fortune 500 companies sending all their internal company data to Google to structure it and train custom AI models. Yeah, that will never happen. What happens instead is that Fortune 500 companies will build up AI expertise to create their own custom AI models, and they will think hard about whether they want the training compute in-house or in a cloud. Nvidia has a huge business building on-premises data centers, which people totally overlook. No CSP will ever compete there, because it's against their primary business model. A Reliance India contract from 2023 alone calls for delivery of 2 million GPUs over a few years. That's probably more than Nvidia's total revenue last year, and that is one large corporation in India alone.
- That’s the fundamental premise of the article: The hyperscalers will consolidate GPU compute exactly as they consolidated all other forms of compute. Including highly sensitive compute like product design and customer databases.
You can argue they won’t, but the “enterprises won’t put sensitive data in Cloud” ship sailed years ago.
- > I'm really looking forward to Fortune 500 companies sending all their internal company data to Google
Their internal company data is already on cloud servers. They’re not going to waste money on doing it all in house. The executives will buy the AI service from Google/Azure/AWS, where the company data is already hosted, avoid the costs and risk of doing it in house, and collect their bonus.
- NVIDIA ain't spent much time in the NFL else they would've known "...when you’re bleeding a guy you don’t squeeze him dry right away. Contrarily, you let him do his bidding suavely. So you can bleed him next week and the week after at minimum."
- It's true, predicting Nvidia's downfall has become a recurring theme. It's easy to underestimate a company that consistently adapts and innovates. Maybe the narrative isn't about "stealing their lunch" but rather carving out specialized niches.
- Actually, it doesn't. Everyone who saw the recent GTC knows what they're planning to launch. What people do not get is that, with the economic slowdown and the price premium, everyone is looking to get more out of their current investments, and the premium does not exist anymore.
Fair Disclosure: I am very neutral when it comes to FLOPS/W/$ and the generality of those FLOPS. Given inference and training, the advantage is slipping.
- So you believe Nvidia announces everything they're currently working on at each GTC?
- I guess some people just want to doom, but after getting into stocks late in life, I can't shake the feeling that some do it for a purpose.
- Google is investing in QC quite a bit, I wonder if Nvidia has. Even Nvidia is going to look antiquated someday.
- Apparently QC now stands for Quantum Computing for anyone else like me who is wondering why Google is investing in Quality Control.
- One of the most important contributors to QC is Nvidia, because they use GPUs to help create software for QC and to run simulations. Nvidia has had CUDA Quantum for years now.
- Not every problem is solved faster by a QC. For many things using a QC would be pure waste.
- Interesting, Marvell is actually down over 50% this year. I just don't understand the bear case at all. I'm a nobody and I'm still willing to buy a $1500 gpu, and that GPU still can't do what the cloud does. The next $1500 gpu probably can't either. It feels like we're over thinking this. The hardware roll-out is all there is imho. Jensen has mentioned he sees Nvidia being a 10 trillion-dollar company, and I'm willing to meet him half-way with my faith here.
Edit:
- I wonder what's stopping Nvidia from releasing an AI phone
- An LLM competitor service (Hey, how about you guys make your own chips?)
- They are already releasing an AI PC
- Their own self driving cars
- Their own robots
If you mess with them, why won't they just compete with you?
Just wanted to say one more thing: Warren Buffett famously said he regretted not investing in both Google and Apple. I think something like this is happening again, especially as there are lulls that the mainstream public perceives, but enthusiasts don't. To maintain the hyperbole, if you are not a full believer as a developer, then you are simply out of your mind.
- >I wonder what's stopping Nvidia from releasing an AI phone
It's a low-margin business, and the hit to the balance sheet would outweigh the completely irrelevant revenue from a project like that.
I've been investing in semi for decades and what strikes me about this recent cycle is that so many don't seem to understand that semi is a highly cyclical business that is prone to commoditization waves and inventory/capacity overbuild.
And speaking as a trader, instead of reinforcing your firmly held base case, I'd strongly consider painting out the bear cases. Look at the roadmaps of the hyperscalers that are designing their own chips for internal use, etc. And never use the word faith when it comes to markets.
You could easily see NVIDIA's margins get chopped down and the multiple re-rate lower from here. Actually, I'd argue the name is already well on its way down this path.
It's almost guaranteed to happen sooner or later. Semi down cycles are usually brutal for semi equities.
That's not to say it isn't a great company. It's certainly not a Buffett name though.
- One thing you have to consider is that these other non-hardware-dedicated companies have to continuously create new generations of AI chips. You can't sit on the M3 or the Google TPUs; you have to keep making new and better ones. How many companies think they can do this stuff in-house and then eventually realize that they are better off relying on a dedicated vendor? One or two leadership changes and they will cut these initiatives entirely (ask the Zuck about the Metaverse), but Nvidia's whole purpose is to make GPU hardware, so they can never truly cut their heart out.
The cyclical stuff was the argument made for semis during the 2010s when no one gave a shit about semis really. I think the game changed, but again, I do operate on faith, or in investor terms, conviction. The main evidence for why the game has changed to me (well, other than AI being the most incredible piece of tech we ever built) is mostly that there are companies that have no business making chip hardware now interested in making chip hardware. That's not usually part of the cycle.
- I generally feel that many have a hard time separating the scenario of a company dying and going bankrupt from the scenario of it simply not being as profitable as before and thus being less valuable.
The former is often unrealistic; the latter is a lot more common. And one really should consider the latter in long-term investments.
- > I wonder what's stopping Nvidia from releasing an AI phone
B2C is a hellish headache that has marginal returns if you are not B2C first, and the amount of investment needed to be B2C competent just isn't worth it when there are alternative options to invest in
> An LLM competitor service (Hey, how about you guys make your own chips?)
Already exists: AI Foundry.
> They are already releasing an AI PC
It's just an OEM rehash
> Their own self driving cars
Not worth the headache and also losing customers like Google or Amazon due to competitive pressure
------
Cannot reply: releasing their own "Nvidia Car" means they would lose their existing automotive partners because those partners will not spend on a competitor. Same reason Walmart stipulates EVERY tech vendor must decouple from AWS when selling to them.
- > Walmart stipulates EVERY tech vendor must decouple from AWS when selling to them.
I'm curious to know more about this if you (or anyone else) can elaborate on it.
What constitutes a tech vendor? Are you talking about Walmart buying PCs from Dell, from buying/renting a SaaS from someone, from IT-consulting coming in to do a one-time service for them (even if that service takes years)?
You're not talking about stuff like "Apple wants to sell iPhones to Walmart customers", I assume - yes?
- They have a whole huge self-driving car division; they just partner with existing car companies.
- NVidia makes perfect tech for Software Defined Vehicles. Maybe if they made a car to kick-start the industry, it might go places.
- Gaming GPUs is a side business for them at this point. It’s all about AI.
- Gosh, I have a coworker who acts like gaming GPUs are the only hardware that matters.
I've tried explaining that one or two AI data center clients for Nvidia dwarfs the entire gaming GPU market, but he just doesn't get it.
- > Gosh, I have a coworker who acts like gaming GPUs are the only hardware that matters.
> I've tried explaining that one or two AI data center clients for Nvidia dwarfs the entire gaming GPU market, but he just doesn't get it.
I have a feeling that the different judgements come from the fact that the coworker thinks that the AI bubble will soon burst - thus, in his judgement, the AI data center sector of Nvidia is insanely overvalued, and will collapse. What will "save" Nvidia then will be the gaming GPUs. Thus, in his opinion, this is the sector that matters most for Nvidia, since it will become Nvidia's lifeline when (not "if"! - in your coworker's judgement) things will go wrong in AI.
You, on the other hand, believe AI data centers are here to stay (which is a bold assumption: it could happen that AI will move more to the edge), and that no big competition will arise for NVidia in "big AI ASICs" (another bold assumption). Your judgment is based on these two strong assumptions about the future, while your coworker's is based on different (possibly similarly bold) assumptions.
- His coworker simply has no idea that datacenters provide 93% of Nvidia's revenue, and a single B200 GPU sells for $40k.
- > His coworker simply has no idea that datacenters provide 93% of Nvidia's revenue, and a single B200 GPU sells for $40k.
Currently. :-)
- Real beats imaginary, though.
- The company is unlikely to make it through a collapse like that. It's like Netflix going back to DVDs via mail.
The market is so tiny that their capex investments into AI stuff would leave them with massive debt that the gaming revenue couldn't support, and they would have to go through bankruptcy.
- Data centers and enterprise will always dwarf consumer hardware. AI has nothing to do with it.
You're silly if you think otherwise.
- > Data centers and enterprise will always dwarf consumer hardware.
Before the current AI hype, except for some rather specialized applications, people had rather little use for GPU acceleration (GPGPU) in data centers.
- Consumer hardware is where the innovation slowly starts though. Without the years of people messing with GPGPU on consumer hardware we might not have got the AI revolution on GPUs.
- Around 15 years ago in college, I took the opposite position that datacenter workloads mattered more to Nvidia’s future than gaming and no one believed me. It is amazing how times have changed.
- Sort of reminds me of ios vs macos. I'm pretty sure the ios market dwarfs macos.
EDIT: brief search says last year apple sold 300m ios vs 20m macos devices.
- AI makes a lot of money. But games are what matter
- Gaming was sort of their major thing until the cryptocurrency wave gave them a major boost and GPUs suddenly became rare luxuries. But as crypto was fading and GPUs were becoming affordable again, the GenAI wave hit and gave them another major boost. I'm curious how they will react to quantum (sadly, no magical enterprise application is apparent yet, so the tailwind is not strong).
- Simulations. Drugs work in the brain at a quantum level, and if we could simulate what a drug does before testing on live humans, that would change the world.
- The supply, driver, and hardware issues that the 5000 generation of gaming GPUs has right now do show that gaming GPUs are an afterthought to them.
- To drive this point home, look at the nvidia valuation in 2010 or so before crypto gave them a non-gaming dominant business line.
- And yet Nvidia innovates in gaming software 10x more than anyone. A strange side business it is, which is also worth billions, by the way.
Did you know that Nvidia has a gaming cloud running which might become the largest in the world at some point?
In 10-20 years, Nvidia might make more revenue from the gaming cloud than they do today from gaming HW.
- Their upcoming AI PC is dead on arrival since they announced its underwhelming memory bandwidth.
- They'll do ~$110 billion in operating income over the next four quarters, with a mere 36,000 employees and no meaningful dividend or debt to maintain. I think they can trivially afford to keep trying if they see a market.
- One interesting aspect of NVIDIA's workforce that many people aren't aware of is that the company is very good at retaining people. The annual turnover rate is something like 5%, and the last time they had layoffs was in 2008. They are also maintaining fully remote work at a time when most other big tech companies are forcing their peons back into the office.
And that buys a lot of loyalty. Which translates to productivity.
- It doesn’t translate to productivity. What it actually means is that it’s an ossified company where everyone is 10 years behind the curve in everything that is not designing hardware. Hardware is a slow business so this is ok. Software is fast moving. Nvidia has extreme difficulty making software that works due to its culture. (Remember installing drivers and how much of a mess that is? That’s their core software product! Docker images and using cloud machines where the drivers are already installed have made this simpler for AI applications.)
- That's what everyone says, right up until they realize that the point at which they could no longer actually afford it was 6 months prior.
- I didn't realize Marvell stock was down so much. I would've thought there'd been an explosion in training and the need for SSDs (Marvell makes controllers for enterprise drives, IIRC), and that Marvell would be doing well. NAND drive prices for consumers are a good amount above their 2023 lows; I figured that if SSDs weren't moving for consumers, maybe datacenters and enterprise were gobbling them up.
Anyway, maybe Marvell should focus more on the consumer side, since hobbyists seem to be building crazy AI rigs and likely need drives, at the very least for models. It sort of seems like hobbyists are devouring any worthy GPU that gets produced.
- I agree the concern here about Nvidia's long term viability seems overblown.
WRT your edit: The answer to all of this is that it's very hard and requires a huge amount of investment to produce good vertical solutions in each of these spaces. You cannot build a good AI phone without first building a good phone. You cannot build a self-driving car without starting with a good car, etc. For robots, I'll point you to someone using Nvidia chips: the Matic is a complete ground-up rethink of how robot vacuums should work. It's taken them 7 years to get to early adopter phase.
- > You cannot build a good AI phone without first building a good phone. You cannot build a self-driving car without starting with a good car, etc.
More like you cannot build a self-driving car without starting with a good phone. See Huawei.
- > I just don't understand the bear case at all. I'm a nobody and I'm still willing to buy a $1500 gpu, and that GPU still can't do what the cloud does.
Agreed, and for all the "price crash" I still can't just whip out my credit card and purchase an hour or two on an H100/B100.
It's still multi-year contracts and "Contact Sales".
- Because Nvidia focuses on being a partner in each industry you mention.
Look at it this way: if you have an OS/SW stack for all the industries you mention, then who is your competitor? Not the participants in those industries. Nvidia can partner with any automotive company but won't compete with any of them as long as they don't build cars. But imagine the potential of every self-driving car being built using Nvidia AI.
Think about the potential of every robot built using Nvidia AI.
Think about the potential of any AI service using Nvidia AI.
See, Nvidia isn't directly competing in the end-user market but instead focuses on B2B. Nvidia can also create many different revenue streams from one customer.
For example, an automotive customer: Nvidia HW in the car for AI; an Nvidia data center (on-prem/cloud) for in-car DriveSim; Nvidia Omniverse for car design and manufacturing simulation; Nvidia Isaac for robotics/logistics in the manufacturing plant; Nvidia Cosmos+GR00T for robots inside the plant; Nvidia edge devices inside any robot in the plant; and an Nvidia NeMo data center (on-prem/cloud) for AI models/LLMs for internal use.
And what will be the advantage? Nvidia can actually make it more and more seamless to operate between all Nvidia solutions. For example, you can do an update to your robots in Cosmos, simulate it in Omniverse and with 1 click update your Nvidia driven real robots. The alternative is that you have 3 solutions from 3 different vendors with no interface between them.
People have no idea what Nvidia is actually creating. Nvidia has more SW engineers than HW engineers, and even Nvidia employees call Nvidia an AI SW company. They publish so many libs and so much other SW that it's sometimes hard to keep up. Just look at all the RTX goodies for gamers which Nvidia is developing. And they are all free, well, except that you need Nvidia HW for them. Nvidia will apply the same model to ALL industries in the world. And here people discuss CSPs being an issue for Nvidia while Jensen focuses on building a mega-corp whose potential TAM is in every industry in the world :)
- Services! Services services services!
This is what will help protect Nvidia now that DC and cluster spend is cooling.
They own the ecosystem thanks to CUDA, Infiniband, NGC, NVLink, and other key tools. Now they should add additional applications (the AI Foundry is a good way to do that), or forays into adjacent spaces like white-labeled cluster management.
Working on building custom designs and consulting on custom GPU projects would be helpful as well by helping monetize their existing design practice during slower markets.
- Of course, Nvidia is starting to do both: Nvidia AI Foundry for the former, and, for the latter, a GPU architecture and design consulting practice, as announced at GTC and under McKinney.
- > They own the ecosystem thanks to CUDA, Infiniband, NGC, NVLink,
No they do not. The article explains that Google, Amazon, Microsoft, and Meta are developing their own hardware and software for AI/HPC.
Google Gemini was not trained using CUDA or Nvidia hardware.
- Only one of the 4 companies you mention is successful at this. And it will remain that way.
Chinese CSPs are the only ones that can develop their own hardware/software for AI/HPC.
- Wrong. It's all explained in the article.
https://www.reuters.com/technology/artificial-intelligence/m...
https://azure.microsoft.com/en-us/blog/azure-maia-for-the-er...
- Of course corporations will have a lot of different bets. Most of them will not pan out but they will try.
Meta will not be able to produce a chip that can run GenAI workloads in the next 2 years.
Microsoft is doing a side-quest, and they haven't even proved themselves with their FPGA adventure and ARM server adventure.
Amazon is legit, they have done well on ARM servers, but Trainium is TBD, and how much they will pull back in a recession, given Jassy is a numbers guy, will be a question mark.
No need to discuss, we can just see this in 2 years, everything will be crystal clear.
- Yeah that's right. The chip may not be as good as Nvidia's, but it doesn't need to be. As the article explains, Nvidia can still lose their position even if they have the best chips.
- They have to be competitive. TPUs are wildly ahead of the pack. And even they aren't particularly competitive. 12 years of ecosystem development by the most advanced AI ecosystem company on the planet and your (ex-Google!) researchers are still going to pelt you with tomatoes if you tell them you are swapping out their H100 cluster with TPUs. JAX remains niche (not saying bad) and extremely hard to use efficiently without the help of Google (no CUDA for going off the beaten path).
I suspect the closed nature of the ecosystem will preclude them from winning as much as they could.
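To illustrate the portability trade-off being described, here is a minimal JAX sketch (my own toy example, not from the thread): the same jit-compiled function is lowered by XLA to whichever backend is present, whereas CUDA kernels are written for Nvidia hardware specifically. "Going off the beaten path" then means depending on XLA to generate good code for you, rather than hand-writing the kernel yourself.

```python
import jax
import jax.numpy as jnp

# Toy op: XLA compiles this for whatever backend is available (CPU, GPU, or TPU);
# there is no device-specific kernel code here.
@jax.jit
def attention_scores(q, k):
    # Scaled dot-product scores followed by a softmax, the kind of op XLA fuses.
    return jax.nn.softmax(q @ k.T / jnp.sqrt(q.shape[-1]), axis=-1)

q = jnp.ones((128, 64))
k = jnp.ones((128, 64))

print(attention_scores(q, k).shape)  # (128, 128), regardless of backend
print(jax.devices())                 # e.g. [TpuDevice(...)] on a Cloud TPU VM
```

The upside is portability; the downside is exactly the complaint above: when the compiler's output isn't fast enough, there is no CUDA-style escape hatch without Google's help.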
- > Working on building custom designs and consulting on custom GPU projects would be helpful as well by helping monetize their existing design practice during slower markets.
Apart from Nintendo, who has successfully partnered with Nvidia? Apple, Microsoft and Sony have all been burnt in the past.
- National Labs (kinda), a big pharma company I don't think I can disclose, and a couple of HFTs, but it's a muscle they will need to build out, because Broadcom and Marvell are eating their cake.
Nvidia started formalizing that last year [0], but it's a new muscle for them.
[0] - https://www.reuters.com/technology/nvidia-chases-30-billion-...
- Watch the Nvidia GTC keynote. The list of partners is extensive.
- Being a partner on the GTC slidedecks isn't remotely good evidence that they haven't been burned by Nvidia.
- Yeah, that immediately came to mind—they talk about distributed systems being a problem, but Nvidia owns the battle-tested and well-regarded HPC networking hardware (Infiniband).
There’s maybe some wiggle room, in that these AI distributed systems might not (?) look like HPC/scientific computing systems—maybe they don’t need Infiniband style low latency. So these other funky networks might work.
But like, Nvidia has the good nodes and the good network. That’s a rough combination to compete against.
- I am starting to think AMD is doing this on purpose and they have some secret handshake deal with Nvidia. Nvidia has at least two more years of “sellout at any price” market. Not because they have the best solution (which they do atm) but because they basically share the monopoly with Apple at TSMC. And Apple is content wasting that away on iPhones.
- CEOs are cousins so there's that ...
- The author completely underestimates NVIDIA's strategic position. They don't need to win the hardware game forever - they're building the entire AI stack: hardware, networking, software, models, developer tools. Nobody else is doing this comprehensively. While hyperscalers are making custom chips for their own use cases, NVIDIA is building a unified platform that everyone else will use. This isn't about who makes the best GPU, it's about who builds the ecosystem that becomes the industry standard.
- Actually, Nvidia has already built the ecosystem. Now, they are refining and adapting it to the fast research in AI.
Others talk about chips when Nvidia thought about interconnects 8 years ago. Today, competitors try to catch up on this while Nvidia talks about One Giant GPU.
The next step will be scale up and then scale out.
Nvidia is always ahead because what the article fails to see is that where CSPs are today is where Nvidia was in the last decade. Nvidia has a working ecosystem for everyone which they can now fine tune with actual customers.
- > While the H100 generation likely represents peak pricing power (new B200s have lower margins and higher COGS), an immediate lack of alternatives means they’ll continue to print cash.
That's not a trend yet. We're about to enter an era where most media is generated. Demand is only going to go up, and margins may not matter if volume goes up.
> The open question is long-term (>6yrs) durability1. Hyperscalers (Google, Microsoft, Amazon, and Meta) are aggressively consolidating AI demand to become the dominant consumers of AI accelerators; while developing competitive, highly-credible chip efforts.
Hyperscalers aren't the only players building large GPU farms. There are large foundation model companies doing it too, and there are also new clouds that offer compute outside of the hyperscaler offerings (CoreWeave, Lambda, and dozens of others). Granted, these may be a drop in the bucket and hyperscalers may still win this trend.
- Most media might be generated, but it remains to be seen whether most media that people will pay for will be generated.
- You don't pay for Facebook, Instagram, Reddit, TikTok. Most don't pay for YouTube.
But to your point, Disney is using GenAI in their new live action Moana film. Presumably that'll do lots of sales.
- You absolutely do pay for all those things with your attention; if genAI content doesn't hold people's attention, then your ad space will be cheap.
> Disney is using genAI in their new live action Moana film
If this is your bar then sure, but I think the interesting question is when/if we cross into a regime where mostly self-managed genAI is in true competition with the media you consume day to day. Not something that's hundreds of thousands of person hours being enhanced by genAI. I don't think there's any chance we see a total collapse of demand for this stuff but I think the jury is still out on how valuable it truly is imo.
I'm sure it's non zero and I expect demand will continue to rise, but it may plateau or slow sooner than people think, I just don't think we can clearly say yet.
- They’re basically going from a functional monopoly to having to compete.
Not ideal for them but hardly a death blow
- How can it be "slipping" if they sell out of all their stuff years in advance? I still can't find any sanely priced 5090s. And before you point out that 5090s are not their main revenue driver, they're sold out of H100s and so on years in advance, too.
- The same way they lost the crypto mining market.
They are losing their biggest customers to custom in-house silicon, and smaller orders are going to have to compete with a market flooded by superfluous hardware from companies that went bust either because the AI bubble shrank or because they weren't able to compete with the big fish.
- The AI market will be drastically larger in 10-15 years, not smaller. The bubble aspects of the present will be trivial compared to the long-term result, as with the dotcom bubble. Google all by itself is worth more today than all the combined dotcoms in existence in 1999.
- Now scale it by M2 money supply (or gold price if you will). Will paint completely different picture.
- I don't think they even wanted to be in the crypto mining market. They artificially hobbled their GPUs specifically so that they wouldn't perform well there.
- And how many 5090s were actually produced? It is easy to sell out with low-volume production.
- Consumer hardware is nothing for Nvidia. Gamers need to realize that, lol.
- The article is about HPC/AI where they are quickly becoming less competitive.
Gaming is only 7% of Nvidia's revenue.
- Oct 2024
- Nvidia's GPU driver QA is definitely slipping
- When takes like these go mainstream (Financial Times, etc) I buy.
- I loaded calls this afternoon, two months out so I am not too scared.
I figure regardless of tariffs and competition and other fluctuations the demand for computing power will endlessly trend upwards and as much as we can produce will be consumed.
Good luck to you.
- Even if this were true, it doesn't necessarily mean it will translate into stock performance. Expected growth is priced into the current stock price. Also, there are lots of "computing power" companies that are not doing too hot (Intel sells "computing power"). Just because there is growth in the sector doesn't mean a given company will do well.
Personally, I have found NVIDIA to be one of the most hyped stocks I have ever seen and it feels weird to take a position that it is under-hyped. That said, people aren't capable of beating the market consistently so I would never invest based on this, or any intuition or information I had.