Yes. AI is f*cking scary. Pause for a second and really think about it.
Not a popular opinion, but I think we're screwed. Even more so because nobody seems to be aware of AI alignment or even care about it, nor to realize how difficult the task really is. Hey, I'm a pretty optimistic person and I hate victim mentality. I think conspiracies are dumb and an excuse not to take action, but this sh*t is serious.
I'm trying to raise awareness about this, but it's like nobody "gets it", as evidenced in threads like this one. If you were told that an incredibly powerful alien civilization was going to land on Earth in 2 years, you'd be like "damn, we should get prepared!". Nothing like that is happening here. Why? Because we don't see it. It's "something on the internet".
It looks more like we're winging it, to be honest. Listening to the Lex Fridman podcast with Sam Altman was really disappointing. Is this the level of conversation that's happening between the most brilliant minds? To me, everything sounds so pretentious, surface-level... and irresponsible. You can't bet the future of humanity on some vague intuition. I'm not being arrogant; there are many people way more intelligent than me, but they aren't necessarily in the mainstream media or academia.
I recommend you guys listen to Eliezer Yudkowsky's interview with Fridman. That's what a smart dude who has been thinking deeply about this issue for years sounds like.
All of this talk about AI being "just a tool" that "helps humans do certain tasks better" is nonsense. GPT-4 outperforms most humans on standardized tests, and it's already showing signs of proto-AGI (being able to self-improve, performing tasks it wasn't trained for). And this is only GPT-4.
I don't even want to know what GPT-5 will look like.
This is progressing geometrically. Do the math. And it won't be long until open-source GPT-4-like models are available to a mass audience, meaning someone with bad intentions, or no concern for safety, can get access to the source code of something capable of creating human-like content...
It's funny how people get triggered about stuff like COVID and gender pronouns (on one side of the political spectrum or the other). But this? Something that poses the threat of extinction for the human race? Something 100,000 times worse?
Nah, dude. I guess it will sort itself out. These guys know what they're doing... They're famous authorities and I see them on TV, so they can't be deranged psychopaths or clueless about what they're talking about.
Guys, I don't care that you don't care. Neither does the AI. I know you want to keep going on with your life. Me too. I'm horrified to be living in this era. This isn't exciting to me. I want to live a normal life, be able to make money, date, and make friends like a normal person. And I'm hurt by the apathy of the general public.
I really want to think it's just a lack of awareness, and that once people have the "aha!" moment they will care. That's why I'm writing this.
And this is not only about job loss, drastic social change we aren't prepared for, general chaos (misinformation, cyberattacks), bias, or a widening wealth gap. No.
I know this sounds out there, but hear me out. We're summoning an outer-dimensional being that not even the best minds working on it right now understand. We don't know what's inside the box! This is basically an alien that "thinks" in nonhuman ways but produces convincingly human output. So we think it "cares".
Nope. It was trained to behave that way.
Think of it like this:
-Does it make sense to deploy a dangerous technology that could potentially cause great harm to sentient life before being 100% sure it will work the first time? No.
So why are we doing that with something more dangerous than nuclear weapons?
-Intelligence is what allowed us to conquer other species, even drive them extinct. Technological advancement is what let a few European invaders annihilate entire civilizations in the Americas.
-Do you think controlling an entity potentially thousands of times smarter than us, such that it aligns with human values, is an easy task? Do you think corporations and politicians monopolizing AI is the biggest threat?
-Do you really think a superintelligent being is NOT going to find a way to k*ll us all very, very fast if we are in some way an obstacle to its goal? (It doesn't even have to be a particularly "evil" AI, as the paperclip maximizer thought experiment shows.)
This isn't a conspiracy theory.
This is something experts in the field think is a very legitimate risk.
So probably the most Fastlane action you can take, if you care about your children living a normal life on Earth (heck, just living at all), is to raise awareness among your friends and family so we can build consensus on this.
This is the most important problem we've ever faced.
We need to stop this madness. How? By shutting it down indefinitely.
Please, I beg you. Don't bury your head in the sand. NOBODY wins if we keep going on with this. Not even big pharma, big corporations, politicians, the elite or whatever conspiracy you believe in. It isn't a matter of politics. It's f*cking common sense. Children that did nothing wrong are going to suffer.
Just so I can get through to you, Fastlane nerds: if there's ever been a DARE event, this is it!!!
One action with disproportionately asymmetric returns, one that erases everything good humanity has ever done. All that sacrifice by previous generations, made futile. Just so we could voluntarily create a technology that can k*ll us all.
But hey! We can laugh at the absurdity of being able to make a machine talk in the King James Bible style about removing a peanut butter sandwich from a VCR. How fun!
So an action with asymmetric positive returns is to take measures to prevent this. Share the article above with your friends. Let's spread the idea that AI sucks, especially if we rush toward it like morons.