Simulated reality

Honestly? I feel like we should start with insects. I think throwing together a simulation of a fistful of neurons and expecting them to work together (even with a "lot" of training) is absurd.

Like, might as well graft some wires to some of my neurons and see if we can beam images into my head. Oh, I know they're not hooked up right, but don't worry, we'll run training on it and eventually this black-box algorithm will pick out the right connections.

That's why I believe in Minsky's philosophy that strong AI will be intentionally designed using hand-written algorithms instead of McCarthy's vision that it'll be just a big ol' neural network.
You're right, computers will never be able to simulate a human brain perfectly. That's why we need to work on their level. We don't simulate physics in games by literally simulating every atom in the universe; we simplify it into a set of algorithms that make practical sense for the hardware platform and for the application. I don't know why it's such a struggle for someone to try the same with AI. The brain is just one possible implementation of intelligence, using overly complicated hardware for lack of a software implementation. A software implementation would be so much simpler, which is kind of why we use tiny embedded processors running an RTOS instead of a giant room of analogue equipment now. Again, we don't simulate the physics of those analogue computers in embedded systems; we just write practically equivalent code for what they were supposed to do.
It wouldn't be perfect, but really, just implement everything we know about psychology into code. It's not like we don't have a pretty solid understanding of how humans think and behave; just write the code to duplicate that.
I'd like to try it myself but I'm fairly busy. I mean, I have theories, but I don't have the time or resources to try it now. Obviously it would be a huge system. I'm really trying not to talk out of my ass after having said "where is the AI you keep writing philosophy about".
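A toy sketch of what "psychology as code" might look like, just to make the idea concrete. Every drive, threshold, and rule here is invented for illustration; it's not any real cognitive model:

```python
# Hypothetical sketch: hand-written behavioral rules instead of a trained network.
# The drives, thresholds, and responses are made up for illustration only.

def choose_action(stimulus, state):
    """Pick a response from explicit, readable rules."""
    if stimulus == "threat":
        # Classic fight-or-flight, gated on a simple confidence value.
        return "fight" if state["confidence"] > 0.5 else "flee"
    if state["hunger"] > 0.7:
        return "seek_food"
    if stimulus == "novel":
        # Curiosity written as a plain rule rather than an emergent property.
        return "investigate"
    return "idle"

state = {"hunger": 0.8, "confidence": 0.3}
print(choose_action("threat", state))  # -> flee
print(choose_action("novel", {"hunger": 0.1, "confidence": 0.9}))  # -> investigate
```

The point is that every behavior is inspectable: you can read the rules and predict the output, for better or worse.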
 
  • Like
Reactions: Marvin
Here's a question I pose to everyone on this forum, a question I know none of you will be able to answer. 'A Muslim philosopher once asked "Why is there something instead of nothing? Doesn't nothing make more sense?"'

I look forward to the answer of anyone reading this. Here's your chance, explain why we even exist.

That question is bunk. There is no nothing; even what you think is nothing is in fact something. The real question should be "why would there be nothing?" - see if you can answer that one.
 
Here's a question I pose to everyone on this forum, a question I know none of you will be able to answer. 'A Muslim philosopher once asked "Why is there something instead of nothing? Doesn't nothing make more sense?"'
The structure of reality has no obligation to cohere to our preconceptions. From our limited perspective, with no tools, it would make more sense to presume the sun moves around the Earth instead of the other way around.
EDIT: Also OP is an autisticdragonkin-tier pseud and I can't wait for his inevitable cult leader arc.
 
  • Agree
Reactions: Marvin
Hey, we bust our asses to make this experience authentic for you. What, do you go to magic shows and hassle the magicians about how their tricks are fake, too?

Look ... as your god, I want you to know that I don't want you to go through this, I'm not sadistic ... it is all the fault of the jews.
If you allow me to guide you all in this battle of holy light vs satanic dark evil (jewry) then we can have permanent peace throughout the cosmos.
 
posts something wild and tin foil hatty
people respond naturally
"lol what?"
"hahahah y'all are just mindless NPCs I am v enlightened"

Reminds me of this image.
 
That's why I believe in Minsky's philosophy that strong AI will be intentionally designed using hand-written algorithms instead of McCarthy's vision that it'll be just a big ol' neural network.
You're right, computers will never be able to simulate a human brain perfectly. That's why we need to work on their level. We don't simulate physics in games by literally simulating every atom in the universe; we simplify it into a set of algorithms that make practical sense for the hardware platform and for the application. I don't know why it's such a struggle for someone to try the same with AI. The brain is just one possible implementation of intelligence, using overly complicated hardware for lack of a software implementation. A software implementation would be so much simpler, which is kind of why we use tiny embedded processors running an RTOS instead of a giant room of analogue equipment now. Again, we don't simulate the physics of those analogue computers in embedded systems; we just write practically equivalent code for what they were supposed to do.
It wouldn't be perfect, but really, just implement everything we know about psychology into code. It's not like we don't have a pretty solid understanding of how humans think and behave; just write the code to duplicate that.
I'd like to try it myself but I'm fairly busy. I mean, I have theories, but I don't have the time or resources to try it now. Obviously it would be a huge system. I'm really trying not to talk out of my ass after having said "where is the AI you keep writing philosophy about".
I think a handwritten implementation would pass the Turing test, but I don't think it'd be sentient.

For any handwritten algorithm, someone can read the source code and get a general idea of how it works and how it would react to a given stimulus. If you can predict it with specific knowledge of how it works, I think we can rule out sentience, at least sentience on a human level.

Well, I'd also say that large parts of the human brain are specialized for certain tasks. But there are big regions that are still a question mark to science. That's general-purpose brain power.

I think the key to sentience is having a lot of general purpose room for thoughts to lurch around in. You have space for optical processing, space for fight-or-flight, space for memories, and then there's lots of question mark areas in between the sections. I think that's where sentience lives.

And I also think that implementing general purpose gray matter like that does require special hardware. In fact, I don't think we can do it in silicon. We can get the theory down right, but it'll run way too slow (or with way too high latency; imagine having brain latency) for the simulated brain to interact with the real world in real time.

I think it's possible we could develop sentient AI through biological engineering. Maybe if we could clone blank neurons and "program" them somehow.
 
  • Like
Reactions: The Fool
The other day I went up up down down left right left right b a and then it basically maxed out my power ups.

Turns out my power ups kinda suck even maxed out.
 
I think the key to sentience is having a lot of general purpose room for thoughts to lurch around in. You have space for optical processing, space for fight-or-flight, space for memories, and then there's lots of question mark areas in between the sections. I think that's where sentience lives.

And I feel that's exactly where we need to start working. We need to challenge the unknown like we do with any scientific field.
Discussing what the brain doesn't do is too large and vague a subject. But I will say one thing people don't seem to focus on at all is emotions, and we already have studies out the ass saying how emotions affect us in almost all ways, mentally and even physically. We remember pain more than pleasure, and if we remember something, suddenly we feel the emotion associated with it, even though that event isn't happening anymore.

People seem so bent on getting a neural network to recognize images that they don't seem to worry about how those images even register to it. Is it interesting? Is it related to anything interesting? Should it be admired? Avoided? If it associates the image with something it likes, what happens when it's combined with something it doesn't like? How the brain registers data is so, so much more complex than just positive/negative reinforcement, and I think a lot of people miss that. Emotional processes would fill at least some of the gaps you mention, and honestly it wouldn't even be much more complicated than, say, a couple of floating-point values assigned to each memory of a subject it has.
You're right, there are many gaps in what we could currently design for an AI, but I think a lot of those gaps have pretty easy solutions if anyone ever bothered to care about them.
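The "couple of floating-point values per memory" idea can be sketched in a few lines. The field names (valence, arousal) and the combining rules here are my own hypothetical choices, not anything proposed in the thread:

```python
# Hypothetical sketch: tag each memory with emotion values and let recall
# and association reuse them. Field names and math are illustrative only.

memories = {
    # subject: (valence: -1 bad .. +1 good, arousal: 0 calm .. 1 intense)
    "fire":   (-0.8, 0.9),
    "kitten": (+0.7, 0.4),
}

def recall(subject):
    """Recalling a memory re-evokes its stored emotion, as the post describes."""
    valence, arousal = memories[subject]
    return {"subject": subject, "valence": valence, "arousal": arousal}

def register(image_subjects):
    """Score a new image by combining feelings about everything it contains.

    Mixing a liked and a disliked subject yields a conflicted reaction:
    near-zero valence but high arousal - one crude answer to "what happens
    when something it likes combines with something it doesn't?"
    """
    vals = [memories[s] for s in image_subjects if s in memories]
    if not vals:
        return {"valence": 0.0, "arousal": 0.0}  # no associations yet
    valence = sum(v for v, _ in vals) / len(vals)
    arousal = max(a for _, a in vals)
    return {"valence": round(valence, 2), "arousal": arousal}

print(register(["kitten", "fire"]))  # mixed feelings: valence ~ -0.05, arousal 0.9
```

Obviously two floats per memory is a cartoon of emotion, but it's enough to make recall, association, and conflicted reactions fall out of very little code.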
 
I just think we all live in an endless loop but with a different outcome every time.
 
To save everyone the trouble of searching, Google tells me this is some kind of cartoon. He's literally getting his philosophy from a Disney ripoff :lol::lol::lol: that's the most basic npc shit imaginable.

It's a cartoon filled with too-deep-for-you visuals that waste like 22 minutes of the show, and the other 3 minutes are Lain going "but the internet IS real life!"
 
  • Like
Reactions: Wendy Carter
Honestly? I feel like we should start with insects. I think throwing together a simulation of a fistful of neurons and expecting them to work together (even with a "lot" of training) is absurd.

Like, might as well graft some wires to some of my neurons and see if we can beam images into my head. Oh, I know they're not hooked up right, but don't worry, we'll run training on it and eventually this black-box algorithm will pick out the right connections.

No, Moore's law was a very predictable and logical consequence of the fact that the first inventors of the transistor really didn't try as hard as they could've. Everyone knew it'd be an arms race that would last a long time.

But the nature of it, fitting more things onto a small area, inherently has a stopping point where you're squeezing the smallest possible things, atoms, together. (Or maybe quantum particles, but like I said, that just stretches out the runway a bit longer.)
I believe they've been able to simulate the complete neural wiring of a worm (the OpenWorm project's C. elegans model) for some time now.
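The atom-squeezing limit mentioned above can be put in rough numbers. The silicon lattice constant (~0.543 nm) is a standard textbook value; the comparison itself is just my illustration:

```python
# Rough illustration of why shrinking transistors hits an atomic floor.
# Silicon's lattice constant (the repeating crystal cell) is about 0.543 nm.
SI_LATTICE_NM = 0.543

def cells_across(feature_nm):
    """How many silicon unit cells fit across a feature of this width."""
    return feature_nm / SI_LATTICE_NM

for feature in (90, 22, 5):
    print(f"{feature} nm feature ~ {cells_across(feature):.0f} unit cells wide")
```

At a genuinely 5 nm feature you're under ten unit cells of silicon, which is the "stopping point" the post is talking about (with the caveat that modern node names are marketing labels, not physical feature sizes).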
 
That's an insanely self-centered point of view. None of us are that important, and making a statement like that indicates an inability to connect with others on your part long before your starring role in any Truman Show.
Well, this person is a 15 year old female, so of course she's self centered.
 
@Hikikomori-Yume I'd suggest getting over yourself while you still can. You're not special, the Universe doesn't care for you or your feelings. You can escape reality and delude yourself into thinking you're the main character in all this, but reality will take you back eventually.
 