> gangsters
Some of them are what one might call “gangster stalkers”, yes.
> most of this forum, maybe all of it, is just keyboard gangsters, children.
i have been programming in c language for 50++ years
its ok.
int cash = 1;        /* assume we start with something */
while (cash > 0) {   /* keep making money as long as there is money */
    cash++;          /* loops until signed overflow, which is undefined behavior in C */
}
bitches can not swim in deep waters.
This code can be configured to run on any currency
Au contraire, I have seen FINE HOUNDS swimming in CRYSTAL CLEAR WATERS. Their offspring were drowned.
trades are open
> can you keep a straight story?
he cant even get his sexuality straight
i studied counseling.
knowing you, you’d rape the baby
dealing with bitches like you is like stealing candy from a baby.
you dont know me.
islam really does allow child brides… in this video, does she look at him as tho he is an older brother or an uncle? no, she looks at him with the love of a bride.
to me, she looks about ten years old maybe twelve.
> are we close to winning the nobel prize for economics yet
i know that seems funny to you, but it isn't.
yep, that was me.
> omg, here is my number one groupie.
@Prokhor Zakharov is kind of a faggot, isn’t he?
prokhor
to call it a faggot would be a compliment.
> Are trades open?
one tiny trade that consists of a single 0.01 lot trade.
> What do you think about the developments in AI? do you have interest in recent developments like the transformer or attention mechanism used for language models?
other than the fact that AI scares the crap outta me, i enjoy using chatgpt and that is the extent of my use or interest of AI.
i dont know what transformer or attention-mechanism is.
within a neural network you have layers. these layers are basically just a big dot product against some learned weights (simple multiplication), then run through what's called an "activation function", which gives the layers a desired non-linearity between them; it's basically what allows a neural network to think in a meaningful way.
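A minimal C sketch of that idea, assuming a made-up layer with 3 inputs, 2 outputs, hand-picked weights, and ReLU as the activation function (all numbers here are illustrative, not from any real model):

#include <stdio.h>

/* ReLU activation: the simple non-linearity applied after the dot product */
static float relu(float x) { return x > 0.0f ? x : 0.0f; }

int main(void) {
    /* one tiny "layer": 2 output neurons, each a dot product over 3 inputs */
    float input[3] = {0.5f, -1.0f, 2.0f};          /* illustrative input vector      */
    float weights[2][3] = {                        /* illustrative "learned" weights */
        { 0.1f, 0.4f, -0.20f},
        {-0.3f, 0.8f,  0.05f}
    };

    for (int o = 0; o < 2; o++) {
        float sum = 0.0f;
        for (int i = 0; i < 3; i++)
            sum += weights[o][i] * input[i];       /* the big dot product against learned weights */
        printf("output[%d] = %f\n", o, relu(sum)); /* the activation function adds the non-linearity */
    }
    return 0;
}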
maybe you can give me a primer on it.
AI scares me.. give it about five years and you might see unexpected things coming from AI.. things that you might not like.
A transformer is a specific kind of "neural block", which is just a couple of layers doing different things, composed into a logical grouping. This transformer block does a couple of things, but it boils down to using a specific kind of layer in a neural network, called an "attention" layer. The attention layer, in a nutshell, makes what you might think of as a correlation score for each datapoint in the attention mechanism, so if the attention layer has, say, 5 inputs, it would produce something like 25 outputs, where each individual input has its 'correlation score' against each of the 5 inputs (including itself). I'm being vague on the idea of correlation here, since really it's a learned scoring system and not correlation, but it's simpler to describe that way.
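A rough C sketch of that 5-inputs-to-25-scores idea, using a plain dot product between every pair of input vectors as the score. A real attention layer scores learned query/key projections of the inputs instead, so this only shows the shape of the computation:

#include <stdio.h>

#define N 5   /* number of inputs */
#define D 3   /* size of each input vector */

int main(void) {
    /* five illustrative input vectors (in a real model these come from earlier layers) */
    float x[N][D] = {
        {1.0f, 0.0f, 0.0f},
        {0.0f, 1.0f, 0.0f},
        {0.0f, 0.0f, 1.0f},
        {1.0f, 1.0f, 0.0f},
        {0.5f, 0.5f, 0.5f}
    };
    float score[N][N];   /* 5 x 5 = 25 outputs: one score per (input, input) pair, including self */

    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++) {
            score[i][j] = 0.0f;
            for (int k = 0; k < D; k++)
                score[i][j] += x[i][k] * x[j][k];   /* how strongly input i "attends to" input j */
            printf("%5.2f ", score[i][j]);
        }
        printf("\n");
    }
    return 0;
}

Real transformers also scale, mask, and normalise these scores, but the pairwise-scoring shape is the same.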
Attention makes it so that when a model is reading a sentence, it can determine which earlier parts of the sentence (the context) matter when reading a particular word. For instance, if I say the word "server", it could mean a computer or a waiter; if I say something like "the server crashed", it's pretty different in meaning to "the server arrived with a bill". Your mind automatically determined what "server" meant here from the context of the sentence. That's approximately what attention does.
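To round out the sketch: the scores for one word against its context are typically squashed into weights that sum to 1 (a softmax) and then used to mix the context vectors, so high-scoring context words contribute more to how that word is read. Every number below is invented purely to show the mechanics:

#include <stdio.h>
#include <math.h>

#define N 4   /* context length, e.g. the words around "server" */
#define D 3   /* size of each context vector */

int main(void) {
    /* made-up scores of the word "server" against each context word (including itself) */
    float score[N] = {0.2f, 2.5f, 1.7f, 0.1f};
    /* made-up vectors for each context word */
    float ctx[N][D] = {
        {0.1f, 0.0f, 0.3f},
        {0.9f, 0.2f, 0.1f},
        {0.4f, 0.8f, 0.0f},
        {0.0f, 0.1f, 0.2f}
    };
    float weight[N], mixed[D] = {0.0f, 0.0f, 0.0f};

    /* softmax: exponentiate and normalise so the attention weights sum to 1 */
    float total = 0.0f;
    for (int j = 0; j < N; j++) { weight[j] = expf(score[j]); total += weight[j]; }
    for (int j = 0; j < N; j++) weight[j] /= total;

    /* weighted sum: context words with larger scores dominate the mixed representation */
    for (int j = 0; j < N; j++)
        for (int k = 0; k < D; k++)
            mixed[k] += weight[j] * ctx[j][k];

    for (int k = 0; k < D; k++)
        printf("mixed[%d] = %f\n", k, mixed[k]);
    return 0;
}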