
Nvidia’s boom is not a straightforward American success story

Nvidia CEO Jensen Huang. Credit: Getty

May 30, 2023 - 7:00am

In what has been a bleak year for Silicon Valley, the sudden surge in the value of tech company Nvidia, driven by its mastery of chips used for artificial intelligence, may seem like a ray of hope. Yet while this success may reward the firm’s owners and employees, as well as tech-oriented financial speculators, the blessings may not redound so well to the industry’s workforce overall, or to the broader interests of the West.

Nvidia’s rise as the first trillion-dollar semiconductor firm reinforces the de-industrialisation of the tech economy. Unlike traditional market leaders such as Intel, Nvidia does not manufacture its own chips, choosing instead to rely largely on the expertise of Taiwanese semiconductor manufacturers. It has limited blue-collar employment. Intel, a big manufacturer, has 120,000 employees — more than four times as many as the more highly valued Nvidia, which epitomises the increasingly non-material character of the Valley.

The company’s value has been tied directly to the profitable, if socially ruinous, expansion of digital media, notably video games and now artificial intelligence. AI is the new crack cocaine of the digital age, with the power to lure people into an ever more artificial environment while providing a substitute for original human thinking and creativity. But it is enormously promising as a potential teaching tool (perhaps obviating the need for professors) and in areas such as law enforcement.

Industry boosters see Nvidia’s rise as straightforwardly good, and as a sign of continued American dominance of the chip industry. They talk boldly of seizing the “high ground” while competitors take over all the basic tasks of actually making things. But as we saw in cars, consumer electronics and the making of cell phones, America’s key competitors — China, South Korea and Japan — are not likely to be satisfied with being hod carriers to the luminaries of Silicon Valley. China’s stated goal is not to partner but to dominate AI, with all the power it gives the country to control both consumer products and people’s minds, not to mention bolstering its military supremacy. This process will be greatly aided by the takeover or total intimidation of Taiwan, leaving Nvidia and other “fabless” chip firms increasingly at Beijing’s mercy.

Nvidia’s CEO, Jensen Huang, seems enthused about serving the Chinese market. He believes the Middle Kingdom is in a position to dominate the tech economy. Not surprisingly, he bristles at attempts by Congress and the Biden White House to curb sales of advanced chips to China. “There is nothing that is slowing down China’s development of technology. Nothing. They don’t need any more inspirations,” Huang remarked recently. “China is full steam ahead.” Given this assumption, it’s logical that Nvidia wants free rein to sell its latest and greatest to Beijing, without much in the way of restriction.

How this benefits humanity, or the cause of democracy, seems dubious but, like much of Silicon Valley, Nvidia thinks little about anything as trivial as the national interest or human rights. The tech oligarchs seem more than willing to sell the instruments of our own undoing for a pretty penny. Nor, given its post-industrial structure, does the company’s rise promise much for most Americans who need better opportunities. Besides, it’s unlikely that the super-geeks at organisations like Nvidia face any prospect of unemployment; even now, sales of luxury houses in places like Portola Valley are still reaching new heights.

Some may see the rise of Nvidia and AI as the harbinger of a technological utopia. What’s more likely is that it’s just one more step on the road to a deadening techno-feudalism and the emergence of an autocratic world system centred on China.


Joel Kotkin is the Hobbs Presidential Fellow in Urban Futures at Chapman University and author, most recently, of The Coming of Neo-Feudalism: A Warning to the Global Middle Class (Encounter)


13 Comments
Steve Murray
10 months ago

There are similar issues with the failure of Manchester University (close to where I live in the UK) to patent graphene, first discovered there nearly 20 years ago. Manufacturing rights are already devolving to the Far East, including China.
But this is essentially a human and political story. When Manchester was at the epicentre of the Industrial Revolution, the human stories of workers moving from the countryside into the factories led Marx to base many of his theories on his experiences in the city. How the manufacturing of microchips progresses in the 21st century may well mirror the manufacturing boom in the 19th century. One wonders if there is a latter-day political thinker waiting in the wings? The lack of freedom of thought in authoritarian regimes suggests that’s much less likely. To be clear – I’m absolutely no fan of Marx, but it can hardly be denied that much of the development of political thought in the modern world has flowed either from his writings or from reactions to them. These techno-industrial issues are society-changing, and we’re living through them.

Nick Faulks
10 months ago
Reply to  Steve Murray

I doubt that a Manchester patent on graphene would have carried much weight in China.

Steve Murray
10 months ago
Reply to  Nick Faulks

Probably true, but the lack of an attempt to secure a patent is the more important point.

Nell Clover
10 months ago
Reply to  Steve Murray

It’s not possible to patent a discovery. Manchester University didn’t have a useful industrial process for making it, which otherwise might have been patentable. On the patenting of possible graphene applications, there are already lots of patents for applications of generic materials meeting one or more properties of graphene; finding a new application made possible by graphene not covered by another patent is an immense challenge simply because there are already so many existing patents to avoid infringing. And then there’s the challenge and cost of defending a patent that will last no more than 25 years, and of defending a patent against claims of copying another. Not patenting graphene isn’t a failure; the failure is the inability of universities, investors and Britain in general to commercialise ideas.

Andrew Dalton
10 months ago

nVidia have specialised in designing PCBs that can perform enormous quantities of parallel floating-point calculations. That is not the forte of a CPU, which is typically an “all rounder” and much more easily programmed.

Video games, at least those with real-time rendered 3D graphics, require vast amounts of mathematical computation. 25 years ago, a number of companies were competing in this fairly new industry (with some overlap with visual FX for movies and TV). nVidia were more or less the only real survivor from that period, although ATI merged with AMD. Intel had no comparable capabilities at the time.

As it turns out, the technology has progressed to such an extent that modern GPUs are effectively supercomputers. The original technology specialising in rendering 3D images has evolved to the point that it may be adapted to machine learning, CAD, and scientific and engineering modelling. It can perform these tasks orders of magnitude faster than a classic CPU.
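A minimal sketch of the kind of workload being described, assuming PyTorch is installed and a CUDA-capable card is present (the matrix size and timing code are illustrative, not from the comment):

```python
# Illustrative only: time the same large matrix multiplication on CPU and GPU.
import time
import torch

N = 4096
a = torch.rand(N, N)
b = torch.rand(N, N)

start = time.perf_counter()
torch.matmul(a, b)                      # CPU: a handful of general-purpose cores
print(f"CPU matmul: {time.perf_counter() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()   # copy the data to the graphics card
    torch.cuda.synchronize()            # wait for the copy before timing
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)          # GPU: thousands of parallel floating-point units
    torch.cuda.synchronize()            # kernel launches are asynchronous
    print(f"GPU matmul: {time.perf_counter() - start:.3f}s")
```

The same massive parallelism that makes this fast for rendering is what makes it transfer so readily to machine learning and scientific modelling.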

Notwithstanding whether AI will be a force for good or ill (in my view, probably both, like most technologies, although I remain pessimistic), the hardware nVidia produce is of fundamental use, and painting the company as somehow dubious because of its contribution to gaming and AI, like some kind of crack dealer, is ridiculous.

Peter B
10 months ago

Not surprised by this frankly ignorant anti-tech rant.
nVidia’s success is far from an overnight thing – it’s taken 30 years and is not an accident. What they do is incredibly difficult and they are the best in the world at it. The idea that China can just come along and copy it is laughable.
The author is probably confusing the fake Silicon Valley companies – think Theranos – with the real ones like nVidia.
His idea that Intel is somehow “better” than nVidia because it makes its own chips is equally ludicrous.
#1 No one makes advanced chips in the Valley any more – it’s far too expensive
#2 Intel makes some of its chips outside the US
#3 Intel makes a good – and increasing – proportion of its chips at TSMC too (just like nVidia)
It’s almost as if this author hasn’t read Adam Smith on the division of labour. Of course Silicon Valley (which used to be the centre of chip manufacturing 50 years ago) now specialises in the highest value activities. It wouldn’t exist if it didn’t.

Last edited 10 months ago by Peter B
Andrew Dalton
10 months ago
Reply to  Peter B

Thumbs up. I posted some similar stuff but apparently I’m in approval purgatory.

Alan Gore
10 months ago

But at the same time, Taiwan is rapidly building new semiconductor manufacturing capacity in the US. Every time I pass the Phoenix Loop 303 interchange being built on I-17, I see the gigantic pair of TSMC buildings, also under construction, rising higher against the sky with every passing week. Nvidia will have no trouble sourcing its GPUs and VPUs when Taiwan falls. In fact, the Pentagon has already gone public with plans to destroy the TSMC fabs there when that happens, leaving the Chicoms with an empty shell.

Last edited 10 months ago by Alan Gore
Peter Kwasi-Modo
10 months ago

In the development of new computer hardware, most new-design components are more energy-efficient than the components they replace in the latest hardware offerings. The big exception to this is the GPU, i.e. the component which Nvidia produces. The GPU model they released in 2021 consumes 350W under load, whereas their 2022 model consumes 450W under load, i.e. about half an electric fire. So an AI application, such as training a large language model, consumes several hundred thousand kWh of energy. But there is also a problem with computer games software: it tends to keep the graphics refresh rate high even when the gamer is just looking at a menu, so the Nvidia GPU continues to operate at near full stretch.
Neither AI nor games are as bad as bitcoin in this respect.
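To put a rough number on that claim, a back-of-the-envelope estimate with an assumed cluster size and training duration (only the 450W figure comes from the comment above):

```python
# Back-of-the-envelope training-run energy estimate; cluster size and duration are assumptions.
gpu_power_kw = 0.45          # 450 W per GPU under load, as cited above
num_gpus = 1000              # assumed number of GPUs in the training cluster
days = 30                    # assumed length of the training run

energy_kwh = gpu_power_kw * num_gpus * 24 * days
print(f"{energy_kwh:,.0f} kWh")   # 324,000 kWh -- several hundred thousand kWh
```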

Andrew Dalton
10 months ago

They are more energy efficient when measured by operations per watt. The problem is that “Moore’s Law is dead” and we’re not seeing the gains we used to (double performance every 18 to 24 months), which is also true in the CPU market.

nVidia are trying to deliver the performance boost by throwing more silicon at the problem with bigger cards. So a generational 50% performance boost may be made up of ~20% more efficiency from transistor shrinking and optimisations and a ~20% larger PCB.
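Since those two contributions compound multiplicatively, a quick check of the rough figures:

```python
# Rough check of the commenter's figures: ~20% efficiency gain on a ~20% bigger card.
efficiency_gain = 1.20
extra_silicon = 1.20
print(f"{efficiency_gain * extra_silicon:.2f}x")   # 1.44x, i.e. roughly the ~50% generational uplift
```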

I agree with you that the situation is getting out of hand though. I’ve taken to frame-rate locking anything that doesn’t need high FPS to keep my electricity bill down, and it’s actually noticeable! I’ve also long since turned the radiator off in the room with my PC.

Peter Kwasi-Modo
10 months ago
Reply to  Andrew Dalton

The problem is the fault of the games software designers, not nVidia. The games software folks could economise on energy by adjusting the frame refresh rate according to the current usage in the game. I set this as a project for computer science students: I got them to monitor power consumption whilst they played their favourite computer games. For each of the games tested, the results were the same. The more powerful the graphics card, the higher the frame refresh rate, often much faster than the screen can handle.
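A minimal sketch of the kind of adaptive frame cap being described, in Python for illustration (the FPS values and function names are assumptions, not from any real game engine):

```python
# Illustrative frame limiter: cap the render loop, and lower the cap on low-motion screens.
import time

ACTIVE_FPS = 60   # fast-moving gameplay
IDLE_FPS = 30     # menus, pause screens, low-motion scenes (assumed values)

def run_loop(render_frame, is_idle, frames=600):
    """render_frame() draws one frame; is_idle() reports whether the scene is low-motion."""
    for _ in range(frames):
        target = 1.0 / (IDLE_FPS if is_idle() else ACTIVE_FPS)
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)   # let the GPU idle instead of re-rendering the same menu
```

Sleeping out the rest of each frame budget is what stops the card from redrawing a static menu hundreds of times a second.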

Andrew Dalton
10 months ago

Sounds like an interesting project. I was always aware of this coming from a background in computer science and image rendering, but you only really understand when you need to pay your own electricity bills!
Low rates of on-screen motion do not require high frame rates – things like strategy games could easily get by with 30-40 FPS. Basically anything that isn’t fast-moving/first-person.

Nicky Samengo-Turner
10 months ago

superb!
