The bottleneck has nothing to do with money; it's that they're using the empty-neuron theory to try to mimic human consciousness, and that's not how it works. Just look up microtubules and consciousness and you'll get a better idea of what I'm talking about.
These AI computers aren't thinking; they're just repeating.
I don't think OpenAI cares about whether their AI is conscious, as long as it can solve problems. If they could make a Blindsight-style general intelligence where nobody is actually home, they'd jump right on it.
Conversely, a proof (or even evidence) that qualia-consciousness is necessary for intelligence, or that any sufficiently advanced intelligence is necessarily conscious through something like panpsychism, would make some serious waves in philosophy circles.