13 Comments

Steve Murray
11 months ago

There are similar issues with the failure of Manchester University (close to where I live in the UK) to patent graphene, first discovered there nearly 20 years ago. Manufacturing rights are already devolving to the Far East, including China.
But this is essentially a human and political story. When Manchester was at the epicentre of the Industrial Revolution, the human stories of workers moving from the countryside into factories resulted in Marx basing many of his theories on his experiences in the city. How the manufacturing of microchips progresses in the 21st century may well mirror the manufacturing boom in the 19th century. One wonders if there is a latter-day political thinker waiting in the wings. The lack of freedom of thought in authoritarian regimes suggests that’s much less likely. To be clear – I’m absolutely no fan of Marx, but it cannot be denied that much of the political thought of the modern world has developed either from his writings or in reaction to them. These techno-industrial issues are society-changing, and we’re living through them.

Nick Faulks
11 months ago
Reply to  Steve Murray

I doubt that a Manchester patent on graphene would have carried much weight in China.

Steve Murray
11 months ago
Reply to  Nick Faulks

Probably true, but the lack of an attempt to secure a patent is the more important point.

Nell Clover
11 months ago
Reply to  Steve Murray

It’s not possible to patent a discovery. Manchester University didn’t have a useful industrial process for making it, which otherwise might have been patentable. On the patenting of possible graphene applications, there are already lots of patents for applications of generic materials that have one or more of graphene’s properties; finding a new application made possible by graphene and not covered by another patent is an immense challenge simply because there are so many existing patents to avoid infringing. And then there’s the challenge and cost of defending a patent that will last no more than 25 years, and of defending it against claims of copying another. Not patenting graphene isn’t a failure; the failure is the inability of universities, investors and Britain in general to commercialise ideas.

Andrew Dalton
11 months ago

nVidia have specialised in designing processors that can perform enormous quantities of parallel floating-point calculations. That is not the forte of a CPU, which is typically an “all-rounder” and much more easily programmed.

Video games, at least those with real-time rendered 3D graphics, require vast amounts of mathematical computation. 25 years ago, a number of companies were competing in this fairly new industry (with some overlap with visual FX for movies and TV). nVidia were more or less the only real survivor from that period, although ATI survived by merging with AMD. Intel had no comparable capabilities at the time.

As it turns out, the technology has progressed to such an extent that modern GPUs are effectively supercomputers. The original technology, specialising in rendering 3D images, has evolved to the point that it may be adapted to machine learning, CAD and scientific and engineering modelling. It can perform these tasks orders of magnitude faster than a classic CPU.
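
For illustration, a minimal sketch of the kind of parallel floating-point work a GPU is built for, assuming a CUDA-capable card and the CuPy library; the matrix size and any timings are illustrative only, not a benchmark:

import time
import numpy as np
import cupy as cp

N = 4096
a_cpu = np.random.rand(N, N).astype(np.float32)
b_cpu = np.random.rand(N, N).astype(np.float32)

t0 = time.perf_counter()
c_cpu = a_cpu @ b_cpu                      # matrix multiply on the CPU cores
cpu_time = time.perf_counter() - t0

a_gpu = cp.asarray(a_cpu)                  # copy both matrices into GPU memory
b_gpu = cp.asarray(b_cpu)
t0 = time.perf_counter()
c_gpu = a_gpu @ b_gpu                      # thousands of GPU threads work on the tiles in parallel
cp.cuda.Stream.null.synchronize()          # wait for the asynchronous GPU kernel to finish
gpu_time = time.perf_counter() - t0

print(f"CPU: {cpu_time:.3f}s  GPU: {gpu_time:.3f}s")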

Notwithstanding whether AI will be a force for good or ill (in my view, probably both, like most technologies, although I remain pessimistic), the hardware nVidia produce is of fundamental use, and painting the company as somehow dubious because of its contribution to gaming and AI, as if it were some kind of crack dealer, is ridiculous.

Peter B
11 months ago

Not surprised by this frankly ignorant anti-tech rant.
nVidia’s success is far from an overnight thing – it’s taken 30 years and is not an accident. What they do is incredibly difficult and they are the best in the world at it. The idea that China can just come along and copy it is laughable.
The author is probably confusing the fake Silicon Valley companies – think Theranos – with the real ones like nVidia.
His idea that Intel is somehow “better” than nVidia because it makes its own chips is equally ludicrous.
#1 No one makes advanced chips in the Valley any more – it’s far too expensive
#2 Intel makes some of its chips outside the US
#3 Intel makes a good – and increasing – proportion of its chips at TSMC too (just like nVidia)
It’s almost as if this author hasn’t read Adam Smith on the division of labour. Of course Silicon Valley (which used to be the centre of chip manufacturing 50 years ago) now specialises in the highest value activities. It wouldn’t exist if it didn’t.

Andrew Dalton
11 months ago
Reply to  Peter B

Thumbs up. I posted some similar stuff but apparently I’m in approval purgatory.

Alan Gore
11 months ago

But at the same time, Taiwan is rapidly building new semiconductor manufacturing capacity in the US. Every time I pass the Phoenix Loop 303 interchange being built on I-17, I see the gigantic pair of TSMC buildings, also under construction, rising higher against the sky with every passing week. Nvidia will have no trouble sourcing its GPUs and VPUs when Taiwan falls. In fact, the Pentagon has already gone public with plans to destroy the TSMC fabs there when that happens, leaving the Chicoms with an empty shell.

Peter Kwasi-Modo
11 months ago

In the development of new computer hardware, most newly designed components are more energy-efficient than the components they replace in the latest hardware offerings. The big exception to this is the GPU, i.e. the component which Nvidia produces. The GPU model they released in 2021 consumes 350W under load, whereas their 2022 model consumes 450W under load, i.e. about half an electric fire. So an AI application, such as training a large language model, consumes several hundred thousand kWh of energy. But there is also a problem with computer games software. Games tend to keep the frame rate high even when the gamer is just looking at a menu, so the Nvidia GPU continues to operate at near full stretch.
Neither AI nor games is as bad as Bitcoin in this respect.
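
To put “several hundred thousand kWh” in context, a back-of-the-envelope figure; the cluster size and run length here are purely illustrative assumptions:

gpu_power_w = 450            # per-GPU draw under load, as quoted above
num_gpus = 1000              # assumed size of the training cluster
days = 30                    # assumed length of the training run
energy_kwh = gpu_power_w * num_gpus * 24 * days / 1000
print(f"{energy_kwh:,.0f} kWh")   # 324,000 kWh, i.e. several hundred thousand kWh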

Andrew Dalton
11 months ago

They are more energy efficient when measured by operations per watt. The problem is that “Moore’s Law is dead” and we’re not seeing the gains we used to (double performance every 18 to 24 months), which is also true in the CPU market.

nVidia are trying to deliver the performance boost by throwing more silicon at the problem with bigger cards. So a generational 50% performance boost may be made up of roughly 20% more efficiency from transistor shrinking and optimisations plus roughly 20% more silicon on a larger card.
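
(Reading those two rough 20% figures as compounding multiplicatively gives something in the region of the generational jump described; a one-line check:)

efficiency_gain = 1.20   # ~20% from the new process node and optimisations
extra_silicon = 1.20     # ~20% more execution units from a bigger chip/card
print(f"combined: {(efficiency_gain * extra_silicon - 1):.0%}")   # ~44%, close to 50%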

I agree with you that the situation is getting out of hand though. I’ve taken to frame-rate locking anything that doesn’t need high FPS to keep my electricity bill down, and it’s actually noticeable! I’ve also long since turned the radiator off in the room with my PC.

Peter Kwasi-Modo
11 months ago
Reply to  Andrew Dalton

The problem is the fault of the games software designers, not nVidia. The games software folks could economise on energy by adjusting the frame rate according to what is currently happening in the game. I set this as a project for computer science students: I got them to monitor power consumption whilst they played their favourite computer games. For each of the games tested, the results were the same. The more powerful the graphics card, the higher the frame rate, often far faster than the screen can actually display.
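
For anyone curious, a minimal sketch of the kind of logging that project involves, assuming an Nvidia card and the nvidia-ml-py bindings (imported as pynvml); run it in the background while the game is playing:

import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)                     # first GPU in the system
try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000   # NVML reports milliwatts
        print(f"{time.strftime('%H:%M:%S')}  {power_w:.0f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()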

Andrew Dalton
11 months ago

Sounds like an interesting project. I was always aware of this coming from a background in computer science and image rendering, but you only really understand when you need to pay your own electricity bills!
Low rates of on-screen motion do not require high frame rates – things like strategy games could easily get by with 30-40 FPS. Basically anything that isn’t fast-moving or first-person.
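
As a sketch of what that cap looks like inside a render loop (render_frame here is just a placeholder for whatever the game actually draws each frame):

import time

def run_capped(render_frame, target_fps=40):
    frame_budget = 1.0 / target_fps
    while True:
        start = time.perf_counter()
        render_frame()                           # draw one frame (placeholder)
        spare = frame_budget - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)                    # idle instead of rendering frames nobody sees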

Nicky Samengo-Turner
11 months ago

superb!
