The Government has yet to decide on the future of HS2, but it’s not looking good for the northern leg to Manchester. Chancellor Jeremy Hunt said that the costs are “totally out of control”. It would be “crazy” not to review the situation, added Defence Secretary Grant Shapps.
The link to Leeds has already been cancelled, so if Manchester is also cut loose then HS2 will become the Birmingham Express. National humiliation beckons, not to mention the final collapse of the levelling-up agenda. However, there is one political upside for Rishi Sunak: another chance to portray himself as a taker of tough decisions. Fraser Nelson, one of the more Sunak-friendly commentators, has been making exactly that case.
But if the PM really is our cost-cutting hero, then as far as HS2 goes he’s late to the scene. Almost a decade ago there were Conservatives calling for the scrapping of the entire project. If they’d been listened to, then years of political effort and billions of pounds could have been saved. £2.3 billion has already been spent on preparatory work for HS2 beyond Birmingham.
Spotting a white elephant while it’s trampling through the public finances isn’t difficult. The real test is stopping a misconceived mega-project before it wastes our time and money.
For instance, if Sunak is as hard-nosed as he’s cracked up to be, he should take another look at Sizewell C nuclear power station in Suffolk. As a mega-project, it’s very similar to Hinkley Point C in Somerset. Still under construction, Hinkley has been plagued by cost overruns and repeated delays.
Unbelievably, the Government’s response to this fiasco is to take a direct 50% stake in the Suffolk plant, which means betting billions of pounds of taxpayers’ money on a technology notorious for its construction risks.
Remember that only last week Sunak decided to delay the national rollout of heat pumps — a simple piece of kit that’s basically a reverse fridge. So is he going to apply his supposed rigour to the daunting complexity of nuclear fission? The cynical answer is: of course he won’t. Sizewell C is still at the planning stage, not the burning money stage. The Prime Minister can bask in the grandeur of his nuclear vision — and leave some future national leader to deal with the consequences.
Meanwhile, another white elephant lumbers into view on the fringes of West London. Sadly, Heathrow’s third runway is back on the agenda. Seemingly determined to compound the worst planning mistake in our post-war history, ministers have already given the go-ahead.
However, there is something that Sunak can do to protect us, and that is to make it entirely clear that this privately financed project won’t receive a penny of public funding. It’s another chance to protect the public purse — and to prove he isn’t just opportunistic.
Join the discussion
“… ensuring that AI systems do what we want them to do, ideally within a sound ethical framework.” Yeah, just like gain of function research into viruses. I’m afraid the toothpaste is out of the tube.
Trust the science
Right?! And just like the ethical framework we all operate within today WITHOUT the potential hazards/benefits of AI.
I do wish that we had a way of hardwiring Asimov’s three laws of robotics into the developing AIs.
Asimov’s stories tend to be about how the three laws are insufficient…
In his later works, Foundation’s Edge for example, the robots are very humanlike – to the point of seeming to have emotions.
“Boots on the ground” as us Guardsmen say, will never be totally replaced by technology.
So the real question is not “Will AI systems be more intelligent than humans?” but “Will our AI systems be more intelligent than those of our potential adversaries?”.
War Studies relative just back from Ukraine doing field research on drones in warfare. Very scary stuff already going on.
Umm, which sort of ethics are meant to guide AI alignment? The ones we deploy in animal agriculture, or the ones we appreciate in a university humanities course? Or the ones which emerged from the Enlightenment, minus the occasional slavery bit?