creditreddit.org
Financial

Prince Harry, Richard Branson, Steve Bannon, and ‘AI godfathers’ call on AI labs to halt their pursuit of ‘superintelligence’—warning the technology could surpass human control | Fortune

By josh · October 22, 2025 · 3 Mins Read



A new open letter, signed by a range of AI scientists, celebrities, policymakers, and faith leaders, calls for a ban on the development of “superintelligence”—a hypothetical AI technology that could exceed the intelligence of all of humanity—until the technology is reliably safe and controllable.

The letter’s more notable signatories include AI pioneer and Nobel laureate Geoffrey Hinton, other AI luminaries such as Yoshua Bengio and Stuart Russell, and business leaders such as Virgin cofounder Richard Branson and Apple cofounder Steve Wozniak. It was also signed by celebrities, including will.i.am; actor Joseph Gordon-Levitt, who recently expressed concerns over Meta’s AI products; and Prince Harry and Meghan, the Duke and Duchess of Sussex. Policy and national security figures as diverse as Trump ally and strategist Steve Bannon and Mike Mullen, chairman of the Joint Chiefs of Staff under Presidents George W. Bush and Barack Obama, also appear on the list of more than 1,000 signatories.

New polling conducted alongside the open letter, which was written and circulated by the nonprofit Future of Life Institute, found that the public generally agreed with the call for a moratorium on the development of superpowerful AI technology.

In the U.S., the polling found that only 5% of adults support the status quo of unregulated development of advanced AI, while 64% agreed superintelligence shouldn’t be developed until it’s provably safe and controllable, and 73% want robust regulation of advanced AI.

“95% of Americans don’t want a race to superintelligence, and experts want to ban it,” Future of Life president Max Tegmark said in the statement.

Superintelligence is broadly defined as a type of artificial intelligence capable of outperforming the entirety of humanity at most cognitive tasks. There is currently no consensus on when, or whether, superintelligence will be achieved, and timelines suggested by experts are speculative. Some more aggressive estimates have said superintelligence could arrive by the late 2020s, while more conservative views push it much further out or question whether current techniques can achieve it at all.

Several leading AI labs, including Meta, Google DeepMind, and OpenAI, are actively pursuing this level of advanced AI. The letter calls on these leading AI labs to halt their pursuit of these capabilities until there is a “broad scientific consensus that it will be done safely and controllably, and strong public buy-in.”

“Frontier AI systems could surpass most individuals across most cognitive tasks within just a few years,” Yoshua Bengio, Turing Award–winning computer scientist, who along with Hinton is considered one of the “godfathers” of AI, said in a statement. “To safely advance toward superintelligence, we must scientifically determine how to design AI systems that are fundamentally incapable of harming people, whether through misalignment or malicious use. We also need to make sure the public has a much stronger say in decisions that will shape our collective future,” he said.

The signatories claim that the pursuit of superintelligence raises serious risks of economic displacement and disempowerment, and is a threat to national security as well as civil liberties. The letter accuses tech companies of pursuing this potentially dangerous technology without guardrails, without oversight, and without broad public consent.

“To get the most from what AI has to offer mankind, there is simply no need to reach for the unknowable and highly risky goal of superintelligence, which is by far a frontier too far. By definition, this would result in a power that we could neither understand nor control,” actor Stephen Fry said in the statement.
