The Core

Why We Are Here => Water Cooler => Topic started by: rcjordan on December 28, 2024, 01:25:52 AM

Title: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: rcjordan on December 28, 2024, 01:25:52 AM

https://www.theguardian.com/technology/2024/dec/27/godfather-of-ai-raises-odds-of-the-technology-wiping-out-humanity-over-next-30-years
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: Travoli on December 28, 2024, 02:34:30 AM
What's the over/under on UFOs beating AI to the punch?
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: rcjordan on December 28, 2024, 04:26:04 AM
>UFOs

HHH! I'm filtering on those and that NJ mass psychosis is still getting through.

That said, those Langley pilots' reports about encounters in restricted airspace *did* catch my attention.
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: littleman on December 28, 2024, 09:45:48 PM
>NJ mass psychosis

We've seen this before.

>UFOs

The odds of intelligent life existing elsewhere in the universe are very good imo, but the odds of us getting a visit from something out there are extremely low.  I don't think people realize how vast the distances between solar systems are.

I just looked up the distance to the nearest star (Proxima Centauri):
4.24 light-years ~ 40 trillion kilometers ~ 25 trillion miles

Combine the distance with the odds of intelligent life existing at the same time as us, and the likelihood is really low.  That's unless our understanding of physics is really primitive.
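That back-of-the-envelope conversion checks out. A quick sketch, using the standard kilometers-per-light-year figure (the constant values are my own, not from the post):

```python
# Distance to Proxima Centauri, converted from light-years.
LIGHT_YEAR_KM = 9.4607e12          # kilometers in one light-year
KM_PER_MILE = 1.60934

distance_ly = 4.24
distance_km = distance_ly * LIGHT_YEAR_KM
distance_miles = distance_km / KM_PER_MILE

print(f"{distance_km:.2e} km")     # ~4.0e13 km, i.e. ~40 trillion km
print(f"{distance_miles:.2e} mi")  # ~2.5e13 mi, i.e. ~25 trillion miles
```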
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: ergophobe on December 31, 2024, 01:01:32 AM
These kinds of articles drive me nuts.

This is like asking Oppenheimer or Teller what the odds are that a nuclear bomb will destroy humanity in the next 30 years and having them put a number on it. Frankly, if you had asked me in 1958, I would have said 80% chance (and if not for Petrov in 1983 and Arkhipov in 1962, it almost certainly would have happened - they should be more celebrated).

Also, "wiping out humanity" is different from "destroying civilization." This chapter by Stewart Brand is worth a read:

https://books.worksinprogress.co/book/maintenance-of-everything/communities-of-practice/unending-world/1

And finally, the assertion that when intelligent beings encounter less intelligent beings, the outcome is always catastrophic for the less intelligent beings is simply not borne out by reality. Ants, cockroaches, squirrels, rats, coyotes and many other animals thrive with humans around. And, yes, we have driven many animals extinct even before we had bows and arrows. But that is not an inevitable outcome of an encounter between humans and a less intelligent species.

I'm not saying it *couldn't* happen, just that assigning a numerical probability to this in the presence of such uncertainty is meaningless. You might as well ask a random number generator. It would be just as valid.

I generally hate it when people pretend to know stuff when they have no clue. I make an exception for myself though.
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: buckworks on December 31, 2024, 03:22:38 AM
>> chapter by Stewart Brand

I HATE IT when people say that Y2K was an overblown concern. Stewart Brand should know better! Y2K disruptions were a very real threat, but we made it through because countless IT folks worked countless hours to update countless computer systems. The fact that we ended up with scattered problems but not major upheaval does NOT mean the threat was imaginary! It means that the right people understood the issues and took effective action to head off problems before they happened. Alas, there's not much glory in that!

There's a paradox in there that probably deserves a name ...
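For anyone who wasn't burning that midnight oil: the core Y2K defect was mundane. Years were stored as two digits with the century assumed. A minimal illustration; the windowed-pivot fix shown is one common remediation pattern, not any particular system's actual code:

```python
def parse_two_digit_year(yy: int) -> int:
    # The classic Y2K bug: store only two digits and assume 19xx.
    return 1900 + yy

def parse_two_digit_year_windowed(yy: int, pivot: int = 70) -> int:
    # A common fix: a sliding window (here, 00-69 -> 2000s, 70-99 -> 1900s).
    return 2000 + yy if yy < pivot else 1900 + yy

# Under the buggy scheme, "00" rolls the date back a century:
assert parse_two_digit_year(99) == 1999
assert parse_two_digit_year(0) == 1900     # meant to be 2000
assert parse_two_digit_year_windowed(0) == 2000
```

Date comparisons, interest calculations, and expiry checks built on the first function all silently break at the rollover, which is why the remediation work was so widespread.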
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: ergophobe on December 31, 2024, 08:21:25 PM
I think you're forgetting the hysteria of the doom scenarios around Y2K, which were indeed invented. I put them up there with AOC saying that humanity could be wiped out in 10-15 years by climate change. They were just absurd. If there is a criticism of Brand, it's that he was way too far on the side of the doomsayers for Y2K.

https://x.com/stewartbrand/status/1847320087335723044?mx=2

In any case, I don't think that changes the fact that Hinton ultimately has no idea whether there is a 99% chance of AI destroying human civilization or a 0.000001% chance. There are too many unknowns and too many variables. It's this sort of "guessing with numbers" that makes people increasingly skeptical of experts.

A trustworthy expert in the field would say, "I have no idea."

I remember noticing this when my grad school was interviewing new professor candidates. There were basically two types. One type thinks they need to answer every question, so they throw out hypotheticals, because saying "I don't know" makes them appear to lack knowledge. The other says "I don't know" freely because, well, they don't know. Typically it was the second type of candidate that got the job.

This was especially clear in the case of two candidates who had gone to the same undergrad and the same grad school and had studied similar topics under the same professors at both schools. One did a lot of unsatisfying guessing and the other simply said, "I don't know. Next question." Everyone was super impressed by the second candidate. I told one of the professors, "If that's the standard, I'll never get a job." He replied, "If that were the standard, I wouldn't have this job."

People like that make excellent scholars and very poor clickbait headlines.
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: buckworks on December 31, 2024, 11:34:30 PM
>> forgetting the hysteria

I remember concern but don't remember hysteria. Maybe that's Canadian vs US media? Or maybe I truly am forgetting.

I do remember that the IT staff at the college where I worked at the time burned a lot of midnight oil to Y2K-proof the college systems.

One of the ITs said: "If we solve a problem, people are grateful. If we think ahead and prevent the problem, no one notices."

>> guessing with numbers

87% of the people who spout statistics make them up as they go along.

I agree with you that pronouncing probabilities can venture into the absurd. Nonetheless it's often reasonable to predict that if Trend X continues, we can expect more of Y, even if it's an overreach to predict exact times or places where Y would show itself.
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: ergophobe on January 01, 2025, 10:37:53 PM
>> don't remember hysteria

I think you have the bad habit of surrounding yourself with reasonable and knowledgeable people  :)

The hysteria was mostly from preppers, evangelical Christians, grifters and supermarket tabloids (remember those!).

> The New York Times reported in late 1999, "The Rev. Jerry Falwell suggested that Y2K would be the confirmation of Christian prophecy – God's instrument to shake this nation, to humble this nation. The Y2K crisis might incite a worldwide revival that would lead to the rapture of the church. Along with many survivalists, Mr. Falwell advised stocking up on food and guns"....

https://en.wikipedia.org/wiki/Year_2000_problem#Fringe_group_responses

>> to Y2K-proof the college systems

It is genuinely one of the great infrastructure maintenance success stories of all time, but planes were never going to suddenly drop from the sky.

>> reasonable to predict that if Trend X continues, we can expect more of Y,

Absolutely. I'm not saying that a self-aware system designed to be fair and just would decide humans are the most dangerous species on the planet and wipe us out. I'm just saying there is no trend to guide a prediction and, in all likelihood, once that trend data is available to study, it's too late. Again, like predicting how many nuclear wars we will have in the 21st century.

BTW, did you realize that half of all nuclear weapons used in history were used on the wrong target, most likely against direct orders? And did you realize that half of all orders to launch nukes were disobeyed by those responsible for the launch?
Title: Re: ‘Godfather of AI’ shortens odds of the technology wiping out humanity: next 30yr
Post by: buckworks on January 02, 2025, 06:58:38 AM
>> reasonable and knowledgeable people

My favourite kind!