I hate everything about this: the lack of transparency, the lack of communication, the chaotic back and forth. We don’t know if the company is now in a better position or worse.
I know it leaves me feeling pretty sick and untrusting, considering the importance and potential (perhaps extreme) disruptiveness of AI in the coming years.
Same here. I like Sam Altman but if the board removed him for a good reason and he was reinstated because the employees want payouts, humanity could be in big trouble.
I actually like the chaos, because I don’t like having one small group of people as the self-appointed, de facto gatekeepers of AI for everyone else. This makes it clear to everyone why it’s important to control your own AI resources.
Accelerationism is human sacrifice. It only works if it does damage… and most of the time, it only does damage.
Not wanting a small group of self-appointed gatekeepers is not the same as accelerationism.
… the goal is not what makes it acceleration.
“Accelerationism” is a philosophical position. The goal is entirely what makes it accelerationism. Quit swapping in a new word with each comment.
For fuck’s sake. You want bad things to happen… so good things happen, later. Bad shit happening is the part that’s objectionable. Saying ‘but I want good things’ isn’t fucking relevant to why someone’s hassling you about this!
The bad shit you want to happen first is the only part that’s real!
No, that’s entirely you assuming things about my position. I don’t want bad things to happen.
I’m with you there; I just hope the general public comes to that realization.
Just like it did with climate change?
Exactly