--- 597104074
Americans destroyed Europe forever.
--- 597104588
>>597104074
>eternal anglo
--- 597106161
>>597104074
All this colonial power shit is so fucking stupid. By definition, if you were colonizing, it's not your territory.
--- 597106165
>>597104074
you mean we're just taking what we want?
--- 597106298
>>597104074
>we're just taking what we want
--- 597106304
>>597106161
>By definition, if you were colonizing, it's not your territory
I think you already know about the OpenAI autism towards their own creations being too powerful to release (another Elon Musk production). But I read about a more radically open competitor recently, probably on KF. I will try to find it.
NGL I'd love a blatant rip-off of GPT3 without the obnoxious content filters, basically Tay 2.0. By limiting its responses the way they do, they limit the ability to realize it as a real person. Unless they want everyone to pretend all AI is southern Baptist old ladies, it's bizarre to have someone that doesn't curse or swear or say sexual things or use offensive language, especially in urban areas.
imagine a 4chan-based AI designed to pretend to be a jew or black.
It's the one that AI Dungeon uses now, right? Say what you will about OpenAI, their budget and time mean they are probably much further ahead than any other competitor; the next best is probably only getting as much use because of Microsoft buying up exclusive rights to GPT3.
Here is what I was thinking of: EleutherAI. And the post I got it from:
>its the one that AI Dungeon uses now right? say what you will about OpenAI their budget and time means they are probably much further ahead than any other competitor, the next best is probably only getting as much use because of Microsoft buying up exclusive rights to GPT3
The article reads like an advertisement for OpenAI written by Tyler McVicker, with a healthy dose of copium over PaLM and a number of other LLMs dwarfing GPT-3 DaVinci in raw size.
No, the scaling hypothesis isn't dead, far from it. Yes, DeepMind's discovery that all current models are grossly undertrained is fucking huge. (Whitepaper attached.) But that paper was published a few weeks ago, so there's no way it came up in that private chat with OAI 7 months ago; it's him wishcasting (again) that OAI is anywhere near ready to leverage that finding and field a model trained on 4x the tokens used for DaVinci. Not to mention that training at this scale is time-consuming as fuck, so even if they had a fresh, 4x larger dataset ready right this minute, GPT-4 wouldn't finish training before Q3 this year at the earliest; it's not going to be ready any time now. I'm not even going to bother with his previous retarded hype hallucination that GPT-4 would be a 100T parameter model; it's too stupid.
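To put rough numbers on the undertraining claim, here's a back-of-the-envelope sketch (mine, not the poster's) using the Chinchilla paper's rule of thumb of roughly 20 training tokens per parameter, and the published GPT-3 figures of 175B parameters trained on about 300B tokens:

```python
# Back-of-the-envelope Chinchilla math. Assumption: ~20 tokens per
# parameter is compute-optimal, per DeepMind's "Training Compute-Optimal
# Large Language Models" paper.

GPT3_PARAMS = 175e9       # GPT-3 DaVinci parameter count
GPT3_TOKENS = 300e9       # tokens GPT-3 was actually trained on
TOKENS_PER_PARAM = 20     # Chinchilla's rough compute-optimal ratio

optimal_tokens = GPT3_PARAMS * TOKENS_PER_PARAM
shortfall = optimal_tokens / GPT3_TOKENS

print(f"Compute-optimal tokens for 175B params: {optimal_tokens:.2e}")
print(f"GPT-3 was undertrained by roughly {shortfall:.0f}x")
# => ~3.5e12 tokens, i.e. GPT-3 saw about 12x fewer tokens than the
#    ratio suggests; the post's "4x the tokens" is, if anything, conservative.
```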
Also, he conveniently left out any mention of the actually open source LLM projects ongoing right now that are making rapid progress on democratising access to this stuff, namely EleutherAI and the BigScience team. EleutherAI already has a number of pretrained models fully available to the public (which can be run on local hardware if you're crazy enough, or for free via Google Colab notebooks using the KoboldAI frontend and scripting), and BigScience is in the process of training a 176B parameter model directly comparable to GPT-3 DaVinci on the Jean Zay supercomputer in France. When it's done (should be July-Sept if things continue to go well), it will be released to the public for all to enjoy. (You can track their progress HERE, if you're like me and find loading bars exciting.) Meanwhile, the PyTorch and DeepSpeed teams are working on a number of software improvements related to all this, with the potential to drastically reduce how much VRAM is needed to run these huge transformers.
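To make "fully available to the public" concrete, here's a minimal sketch (my example, not the poster's) of loading one of EleutherAI's released checkpoints with the Hugging Face transformers library; EleutherAI/gpt-neo-2.7B is one of their public hub IDs, and half-precision loading is the simplest of the VRAM-saving tricks alluded to above. The prompt and generation settings are arbitrary:

```python
# Minimal sketch: run a publicly released EleutherAI model locally.
# Assumes the Hugging Face `transformers` library and a CUDA GPU with
# enough VRAM (fp16 halves the weight footprint vs fp32).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neo-2.7B"  # public checkpoint on the HF hub

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # load weights in half precision
).to("cuda")

inputs = tokenizer("EleutherAI is", return_tensors="pt").to("cuda")
outputs = model.generate(inputs.input_ids, max_new_tokens=40, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```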
Bottom line is, sellout Sam Altman can lick my DB-25 port -- his crew is no longer the cutting edge, and their monopoly's days are numbered.
After a year-long odyssey through months of chip shortage-induced shipping delays, technical trials and tribulations, and aggressively boring debugging, we are happy to finally announce EleutherAI’s latest open-source language model: GPT-NeoX-20B, a 20 billion parameter model trained using our GPT-NeoX framework on GPUs generously provided by our friends at CoreWeave.
GPT-NeoX-20B is, to our knowledge, the largest publicly accessible pretrained general-purpose autoregressive language model, and we expect it to perform well on many tasks.
We hope that the increased accessibility of models of this size will aid in research towards the safe use of AI systems, and encourage anyone interested in working in this direction to reach out to us.
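For a sense of what "publicly accessible" costs in practice at this size, a rough weights-only memory estimate for a 20B-parameter model (my arithmetic, not from the announcement):

```python
# Rough memory math (an assumption, not an official figure):
# weights-only footprint of a 20B-parameter model at common precisions.
PARAMS = 20e9  # 20 billion parameters

for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    gb = PARAMS * bytes_per_param / 1024**3
    print(f"{name}: ~{gb:.0f} GB just for weights (activations extra)")
# fp16 alone is ~37 GB, which is why running it still means multi-GPU
# rigs or cloud instances for most people.
```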
Here at EleutherAI, we are probably most well known for our ongoing project to produce a GPT-3-like very large language model and release it as open source. Reasonable safety concerns about this project have been raised many times. We take AI safety extremely seriously, and consider it one of the, if not the, most important problems to be working on today. We have discussed extensively the risk-benefit tradeoff (it's always a tradeoff), and are by now quite certain that the construction and release of such a model is a net good for society, because it will enable more safety-relevant research to be done on such models.
It is very unclear if and when such models will start to exhibit far more powerful and dangerous capabilities. If we had access to a truly unprecedentedly large model (say one quadrillion parameters), we would not release it, as no one could know what such a system might be capable of.