I have read somewhere that experiencing a supernova from the distance at which the Sun sits would be about the same as holding a hydrogen bomb against your eyeball. The energy released in these events is basically unimaginable.
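A back-of-envelope check of that comparison (a rough sketch; the 10^44 J supernova energy, the 1 MT bomb yield, and the 1 cm eyeball distance are all my assumptions, and real supernovae dump much of their energy into neutrinos and ejecta rather than light):

```python
import math

E_SN = 1e44        # assumed total supernova energy output, joules
E_BOMB = 4.184e15  # assumed 1-megaton hydrogen bomb yield, joules
AU = 1.496e11      # Earth-Sun distance, metres
EYE = 0.01         # assumed bomb-to-eyeball distance, metres

def fluence(energy_j, distance_m):
    """Energy per unit area at a given distance, assuming the
    release spreads evenly over a sphere (inverse-square law)."""
    return energy_j / (4 * math.pi * distance_m ** 2)

sn = fluence(E_SN, AU)       # ~3.6e20 J/m^2
bomb = fluence(E_BOMB, EYE)  # ~3.3e18 J/m^2
print(f"supernova at 1 AU: {sn:.2e} J/m^2")
print(f"1 MT bomb at 1 cm: {bomb:.2e} J/m^2")
print(f"ratio: {sn / bomb:.0f}x")  # supernova wins by ~2 orders of magnitude
```

Under these assumptions the supernova still beats the bomb-on-eyeball by a factor of about a hundred.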
It depends on the kind of supernova. Type Ia[1] is really insane: around 10^44 J, which I think could blind you even if you chose a picnic spot to watch the Big Boom from a distance of 1 parsec. A white dwarf made mostly of carbon burns all of its carbon into oxygen in a matter of seconds, and then burns some of that freshly made oxygen too. It would happily keep brewing heavier and heavier elements, but it can't, because it becomes so hot that gravity is no longer enough to keep the matter from flying away.
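To put the 1-parsec picnic in perspective (again a rough sketch; treating the whole quoted 10^44 J as if it all arrived as light is my simplifying assumption, when in reality most of it goes into the kinetic energy of the ejecta):

```python
import math

E_SN = 1e44              # quoted Type Ia energy, joules (assumed all radiated)
PARSEC = 3.086e16        # metres
SOLAR_CONSTANT = 1361.0  # noon sunlight at Earth, W/m^2

# Inverse-square fluence at the picnic spot
fluence = E_SN / (4 * math.pi * PARSEC ** 2)  # ~8.4e9 J/m^2
print(f"fluence at 1 pc: {fluence:.2e} J/m^2")

# Equivalent exposure expressed as continuous noon sunlight
seconds = fluence / SOLAR_CONSTANT
print(f"equivalent to {seconds / 86400:.0f} days of nonstop noon sun")
```

That works out to roughly seventy days' worth of noon sunlight arriving from a single point in the sky, so staring at it would be a bad idea even under much more charitable assumptions.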
All the stars in the universe, burning as brightly as they are, are the tiniest fraction of additional energy compared to the 2.73 K background temperature of space. The Big Bang was very warm.
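You can put a number on that background with the blackbody energy-density formula u = aT^4 (a quick sketch; the radiation constant and the unit conversion are standard textbook values, not from the comment above):

```python
# Energy density of blackbody radiation: u = a * T^4
A_RAD = 7.5657e-16  # radiation constant, J m^-3 K^-4
T_CMB = 2.73        # CMB temperature, kelvin
EV = 1.602e-19      # joules per electronvolt

u = A_RAD * T_CMB ** 4  # ~4.2e-14 J/m^3
print(f"CMB energy density: {u:.2e} J/m^3")
print(f"  = {u / EV / 1e6:.2f} eV/cm^3")  # ~0.26 eV/cm^3
```

For what it's worth, published estimates of the cosmologically averaged starlight come out roughly an order of magnitude below this, so "tiniest fraction" is a slight exaggeration, but the CMB really does dominate the photon budget.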
For certain values of safe. It’s close enough to strip the ozone layer, significantly increase the risk of cancer, alter the climate, and possibly cause extinctions.
At these scales, several orders of magnitude literally make no difference.
Hydrogen bomb yields range from roughly 0.1 MT to 100 MT (the full design yield of the Tsar Bomba), a span of three orders of magnitude. They can be considered equivalent for the purposes of this comparison. The principal warhead of the US ICBM force, the W87, has a yield of ~0.3 to 0.475 MT.
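A quick sketch of why the spread in yields washes out (the 10^44 J supernova figure is carried over from the comment above; the yield-to-joule conversion uses the standard 4.184e15 J per megaton):

```python
import math

E_SN = 1e44    # supernova energy quoted upthread, joules
MT = 4.184e15  # joules per megaton of TNT

for yield_mt in (0.1, 0.475, 100.0):  # small bomb, W87, Tsar Bomba design
    ratio = E_SN / (yield_mt * MT)
    print(f"{yield_mt:>6.1f} MT bomb: supernova is 10^{math.log10(ratio):.1f} x bigger")
```

Whether you pick the smallest bomb or the biggest one ever designed, the supernova outclasses it by 26 to 29 orders of magnitude, so the three-order spread among the bombs themselves is a rounding error.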
Even at a distance of several tens of metres from your eye, destructive effects would remain significant.