Engines, planes, PVs. They all gradually improved in ways that don’t really map onto the organic functions they augment, replace, or take inspiration from.
Planes don’t fly like birds. There is very little reason for what you would call a “general intelligence” AI to develop in a way that mimics our own intelligence. It might, but I would find that more notable than if it did not.
I think the biggest difference between building a general intelligence and other engineering problems is that in the latter, the target function is well-defined. With planes, anything that can fly safely counts. With nuclear reactors, anything that generates electricity. But with intelligence... what does it need to do, exactly? The 'moving goalpost' keeps moving because we have no idea where it should be. We're building AI blindfolded. I don't think we're currently solving the problem, simply because we don't know what the problem is.