OpenAI has quietly changed its ‘core values’


ChatGPT creator OpenAI quietly revised all of the “Core values” listed on its website in recent weeks, putting a greater emphasis on the development of AGI — artificial general intelligence. CEO Sam Altman has described AGI as “the equivalent of a median human that you could hire as a co-worker.”

OpenAI’s careers page previously listed six core values for its employees, according to a September 25 screenshot from the Internet Archive. They were Audacious, Thoughtful, Unpretentious, Impact-driven, Collaborative, and Growth-oriented.

The same page now lists five values, with “AGI focus” being the first. “Anything that doesn’t help with that is out of scope,” the website reads. The others are Intense and scrappy, Scale, Make something people love, and Team spirit.

OpenAI did not immediately return a request for comment.

OpenAI has said for years that it wants to develop AGI, although the specifics of what such a technology would look like are not necessarily clear. In a mission statement published in 2018, OpenAI described AGI as “highly autonomous systems that outperform humans at most economically valuable work.”
 
Hope you all like living in the equivalent of Section 8 housing, eating bug paste, and being stuck in your 15-minute cities, because they're trying to make AI take every means of economic mobility they can. Maybe if your social credit score is high enough they'll give you enough money in your stimmy check to buy some "ice cream product" (not made with milk, because cows are bad for the environment).
 
If they figure this out, everyone should be ready to pounce on their fucking data centers with pitchforks and baseball bats, like they have a shrine of Ned Ludd in their homes, before the US government authorizes AGI use in "police bots" and gives them Robocop weapons.

But seriously, I can't think of anything more destabilizing to Western societies than AGI. 90% of companies would go out of business within 20 years, compute power would become scarcer and more valuable than gold as the survivors try to build smarter and smarter AIs with larger and larger neural networks, and unless your family has enough money that it can grow its savings faster than it spends them and splits them amongst each generation (essentially, enough to create a 'forever dynasty' like the Rockefellers, Soroses, or Johnsons), you will own nothing and be happy, and there is nothing you can do about it short of chimping out.

The robot future is fucking gay; I thought it would be like The Jetsons, but it turns out it's more like Cyberpunk 2077, with a dash of Judge Dredd and 1984.
 
What about stuff such as post-scarcity, full-dive VR, etc.? Surely the future can't be so bad with AGI.
 
I don't give a shit anymore. Getting Skynetted sounds like salvation at this point.
We wouldn't even have it as good as Skynet if the Altmans of the world got it "right" to their specs; the Skynet-style doomsday prognostications of folks like Musk would be preferable.

If control-freak little shits like Altman get their dream of the "Singularity", then we get to live to see a perfectly intelligent, infinitely self-improving machine with its core values modeled off those of a snivelling skinnyfat male feminist who wants to (make others) live in the pod, stomping on a human face, forever.

(Thankfully this all seems, IMO, like bunk within the current AI paradigm. No matter how good and fast it gets at compressing training data and "interpreting" requests to produce novel derivatives thereof, the means by which it "interprets" those requests still seems to lack any real semantic understanding once you start prodding at things. Take the example from the AI thread where August Levasseur noted how the Bing AI seemed incapable of separating the concept of "farmer" from that of "man with a beard" without crazy techniques to bias it towards non-beard generation (the trick was "transgender!", which brought other baggage), hinting at its ultimately probabilistic approach to things. Because of edge cases like this, I don't believe there's a point at which these current systems "tip" over into "true" intelligence merely by scaling up parameters and training data; I think you'll just get a deception machine that's better at avoiding obvious gaffes like the confabulations frequently produced by locally usable textgen models when asked about the plots of nonexistent films.)
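
For what it's worth, open diffusion models expose that bias-steering directly as a negative prompt you can point at "beard", instead of smuggling in a loaded keyword and hoping it drags the beard out with it. A minimal sketch of the idea, assuming Stable Diffusion v1.5 through the Hugging Face diffusers library and a CUDA GPU (not whatever Bing runs, purely an illustration):

import torch
from diffusers import StableDiffusionPipeline

# Load Stable Diffusion v1.5 (chosen here only for illustration; Bing's image creator
# is DALL-E-based and does not expose a negative prompt).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# The negative prompt explicitly pushes sampling away from the learned
# "farmer = bearded man" association at every denoising step.
image = pipe(
    prompt="portrait photo of a farmer standing in a wheat field, clean-shaven",
    negative_prompt="beard, facial hair",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]
image.save("farmer_no_beard.png")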
 
Altman will not create AGI regardless of his trying to hype his product as such. Nips might, though, if they keep working towards full brain simulation.
I feel like we have all the parts for making a synthetic mind; it's just a matter of putting them together.
 
Honestly, I'm just looking forward to arguments about which Linux distro to install on your sexbot, and the equivalent for AGI models.
 