Solar gets lots of love. But when it's planet-based, it's down 50% of the time: at night. Not very useful for life support.
I don't know why orbital solar isn't discussed more. Folks cite efficiency losses when transmitting power to the ground, but do those losses come anywhere near the 50% downtime of ground-based?
I think you fundamentally misunderstand what's needed. It's more of a science fiction cliché that life support needs immense amounts of power. The Earth's life support is based on solar power, and that has worked out just fine for humanity. Ideally life support should be as passive as possible, and using solar power is a great way to achieve that.
PS: Orbital solar is an issue due to launch costs, and the fact that most electricity is used in the daytime. Solar + battery is also getting very cheap.
Understand that a colony on the Moon isn't there just as a proof of concept. It's there to do something, probably industrial. Running billion-dollar industrial processes only when the sun is shining is clearly a downside.
If you're talking scale, that's a question of economics.
Batteries + solar are already cheaper than new nuclear power on Earth over a 24-hour cycle. For Mars the reduced sunlight is an issue, but nuclear also faces major issues there: no ready supply of water and very low atmospheric pressure make cooling hard. Low ground temperatures let you dissipate heat, but construction costs would be dramatically higher.
The Moon is different due to its extended day/night cycle, which also makes it extremely unappealing for long-term colonization. Trying to build a reactor that still operates when ground temperatures hit 127 degrees Celsius is again very difficult.
Out past Mars solar is a poor fit. But, without something like cheap fusion power it’s extremely unlikely giant colonies would be viable anyway.
Oh! Mars may not be so poor for solar. Inverse-square means less sunlight per area, so less electricity. But the thin atmosphere factors in too! I'd estimated that Earth-based solar loses 90% of the sunlight by the time it gets to the ground.
So in fact a solar panel in space around the asteroid belt (outside the orbit of Mars) gets about the same flux per square meter as a ground-based panel on Earth. If I estimated right.
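A quick sanity check of that estimate. Both inputs are assumptions on my part: the 90%-loss figure from the previous comment, and ~2.7 AU for the middle of the asteroid belt:

```python
# Sanity-check: flux at the asteroid belt vs Earth's surface, assuming 90% atmospheric loss.
SOLAR_CONSTANT = 1361.0   # W/m^2 at 1 AU (Earth's orbit)
BELT_DISTANCE_AU = 2.7    # assumed middle of the asteroid belt
ATMOSPHERIC_LOSS = 0.90   # the 90%-loss estimate above

belt_flux = SOLAR_CONSTANT / BELT_DISTANCE_AU**2            # inverse-square falloff
earth_ground_flux = SOLAR_CONSTANT * (1 - ATMOSPHERIC_LOSS)

print(f"asteroid belt: {belt_flux:.0f} W/m^2")      # ~187 W/m^2
print(f"Earth ground:  {earth_ground_flux:.0f} W/m^2")  # ~136 W/m^2
```

Under that loss assumption the two come out roughly comparable, so the "about the same" claim at least hangs together internally.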
Anyway, I'm talking space-based solar vs ground-based solar (not nuclear), due to the twin advantages of less atmosphere to get through and no (or less) night.
90% is wildly off base. In Earth's orbit you get ~1361 W/m², and on the surface you can hit ~1050 W/m² of direct sunlight and up to ~1120 W/m² when including radiation scattered or re-emitted by the atmosphere and surroundings. https://en.wikipedia.org/wiki/Solar_irradiance
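For what it's worth, those figures put peak clear-sky losses in the ~20% range, nowhere near 90%. A quick sketch using the Wikipedia numbers above:

```python
# Atmospheric loss implied by the irradiance figures above (clear-sky peak values).
TOP_OF_ATMOSPHERE = 1361.0  # W/m^2, solar constant at Earth's orbit
SURFACE_DIRECT = 1050.0     # W/m^2, peak direct sunlight at the surface
SURFACE_GLOBAL = 1120.0     # W/m^2, including scattered/re-emitted radiation

direct_loss = 1 - SURFACE_DIRECT / TOP_OF_ATMOSPHERE
global_loss = 1 - SURFACE_GLOBAL / TOP_OF_ATMOSPHERE

print(f"direct loss: {direct_loss:.0%}")  # ~23%
print(f"global loss: {global_loss:.0%}")  # ~18%
```

Averages over clouds, angles, and night are a separate question, which is where the halving figure later in the thread comes in.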
Looking at averages is misleading. Clouds and the atmosphere reduce this significantly on average, but that's very location-specific. Near the poles the sun stays at very low angles 24/7, which significantly increases the average absorption. But solar is a poor fit at the poles anyway.
Ok, I don't know where I got that. Looking again, I see an average halving of total solar flux from space to sea level.
Mars is ~142M miles from the Sun on average; the Earth is ~93M. That's a distance ratio of about 1.52:1, which puts the inverse-square reduction at roughly 2.3:1.
What does that mean? Solar flux in Mars orbit is about the same as Earth's at sea level, so expect similar solar panel output per unit area.
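The arithmetic, as a quick sketch. I'm assuming ~1.52 AU for Mars's mean distance and the ~50% sea-level atmospheric loss mentioned above:

```python
# Compare top-of-atmosphere flux at Mars to average flux at Earth's sea level.
SOLAR_CONSTANT = 1361.0   # W/m^2 at 1 AU
MARS_DISTANCE_AU = 1.52   # assumed mean distance of Mars from the Sun
SEA_LEVEL_FRACTION = 0.5  # ~half the flux survives to Earth's sea level on average

mars_flux = SOLAR_CONSTANT / MARS_DISTANCE_AU**2  # inverse-square falloff
earth_sea_level = SOLAR_CONSTANT * SEA_LEVEL_FRACTION

print(f"Mars orbit:      {mars_flux:.0f} W/m^2")   # ~590 W/m^2
print(f"Earth sea level: {earth_sea_level:.0f} W/m^2")  # ~680 W/m^2
```

Mars-orbit flux comes out within ~15% of Earth's average sea-level flux, which is "about the same" for back-of-envelope purposes.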
But does the whole system weigh any less? With solar in orbit, storage requirements may be far smaller, and batteries have got to be the biggest weight component in the system.