@Humpleupagus @icedquinn @miscbrains > I do believe that at a very fundamental level, Materialism can't explain consciousness.
That question isn't particularly meaningful without a definition of consciousness.
> and the brain is akin to an antenna that interfaces that medium.
There's a famous theory along these lines; the guy that made ZTD did a bit about it, but it's an old theory and I don't remember its name.
> I also assume that physically, the system requires synaptic isolation and cross-talk / bleed. The fact it creates / projects one unified experience tends to suggest cross-talk / convergence at some level.
I think this is an involuntary process that the brain carries out via communication. You slow your pace in the grocery store because they put on a slow song: the speaker isn't conscious, but we use music to communicate, and the people that were able to ignore it are dead. Same with someone saying something to you that you can't forget: it keeps ringing in your ears while you try to sleep. That happens to everyone, meaning the people it doesn't happen to are dead, meaning it's probably important to our survival.
> So I'm not convinced that more cores would necessarily create consciousness,
Well, create a thing of comparable intelligence to a human. But like I said earlier, even that many cores wouldn't be comparable. In terms of "consciousness" (whatever that means), I don't have anything to say. As noted upthread, I think the system is too complex to simulate: we'd have to hollow out the moon to build something that could do it. We can enumerate every neuron in C. elegans, and it is still incredibly computationally expensive to simulate that worm in a very uniform environment:
https://github.com/openworm/OpenWorm . It can't be done in real-time on a desktop machine, and that's simulating a few hundred neurons and modeling the body the way a game does: not with molecules, but with a lot of hand-waving and simplification.
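To make the cost concrete, here's a back-of-envelope sketch. This is not what OpenWorm actually runs (it uses much richer biophysical neuron models and a particle-based body simulation); it's the crudest possible point-neuron model, a leaky integrate-and-fire network, just to show that the per-step work already scales with neuron count times time resolution, and a realistic model multiplies that by orders of magnitude per neuron:

```python
# Back-of-envelope sketch: even a crude point-neuron model gets expensive fast.
# NOT OpenWorm's method; a deliberately minimal lower bound on the work involved.

def simulate_lif(n_neurons, n_steps, dt=1e-4):
    """Leaky integrate-and-fire network with uniform all-to-all coupling.

    Parameters here (tau, thresholds, weight, input current) are arbitrary
    illustrative values, not fitted to C. elegans.
    """
    tau, v_rest, v_thresh, v_reset = 0.02, -65.0, -50.0, -65.0
    w = 0.5                       # uniform synaptic weight (arbitrary)
    v = [v_rest] * n_neurons      # membrane potentials
    spikes = 0
    for _ in range(n_steps):
        fired = [i for i, vi in enumerate(v) if vi >= v_thresh]
        spikes += len(fired)
        for i in fired:
            v[i] = v_reset        # reset neurons that crossed threshold
        drive = w * len(fired)    # every spike excites every neuron
        for i in range(n_neurons):
            # leak toward rest, plus synaptic drive, plus a small constant input
            v[i] += dt / tau * (v_rest - v[i]) + drive + 0.02
    return spikes

# 302 neurons (the C. elegans count) at 0.1 ms steps means
# n_neurons * n_steps state updates per simulated second of biology,
# before you add ion channels, multi-compartment cables, or a body.
simulate_lif(302, 100)
```

Even this toy does thousands of updates for a few milliseconds of simulated time; swap in Hodgkin–Huxley channel dynamics and a fluid-coupled body and the real-time budget on a desktop is gone.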
So we have these large language models, and big companies are throwing unprecedented amounts of money at a chatbot: this doesn't come close. It's an algorithm sitting on a vast trove of data; it's nowhere near an entire brain. It's not even comparable.
> though it could probably imitate it,
What is the difference between consciousness and the imitation of consciousness?