One important component is adding significant damping, corresponding to a small probability of jumping to a random page. This ensures the graph is effectively fully connected, so the power iteration converges quickly.
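For illustration, here is a minimal power-iteration sketch with that random-jump term; the 0.85 damping factor and the tiny link matrix are just assumed example values, and dangling pages are handled only crudely.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Power iteration for PageRank.

    adj[i, j] = 1 means page j links to page i.  The (1 - damping) term is
    the uniform random jump: it makes the chain irreducible and aperiodic,
    which is what guarantees (and speeds up) convergence.
    """
    n = adj.shape[0]
    out_degree = adj.sum(axis=0)
    out_degree[out_degree == 0] = 1.0          # crude fix for dangling pages
    M = adj / out_degree                       # column-stochastic link matrix
    r = np.full(n, 1.0 / n)
    while True:
        r_new = damping * (M @ r) + (1.0 - damping) / n
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# tiny made-up web of four pages
adj = np.array([[0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(pagerank(adj))
```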
I just did this exact thing today, writing a cloth simulator! I wanted to try updating each node position in turn using a Newton step, holding the other nodes fixed.
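Not the actual simulator, obviously, but here is a toy 1D analogue of that scheme: a hanging chain of masses on springs, relaxed to equilibrium by sweeping over the nodes and taking a Newton step on each coordinate while its neighbours stay fixed (a coordinate-wise Newton / nonlinear Gauss-Seidel iteration). All constants are made up, and it assumes the chain stays vertically ordered.

```python
import numpy as np

k = 100.0     # spring stiffness (assumed)
rest = 1.0    # spring rest length (assumed)
m, g = 1.0, 9.81
n = 10        # free nodes; the node above node 0 is a fixed anchor at y = 0

y = -rest * np.arange(1, n + 1)    # initial guess: chain at rest length

for sweep in range(200):
    for i in range(n):
        up = 0.0 if i == 0 else y[i - 1]
        # gradient and Hessian of the energy w.r.t. y[i], neighbours held fixed
        grad = k * ((y[i] - up) + rest) + m * g
        hess = k
        if i + 1 < n:
            grad += k * ((y[i] - y[i + 1]) - rest)
            hess += k
        y[i] -= grad / hess        # per-node Newton step

print(y)   # top spring stretches most (it carries the whole chain), bottom least
```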
If it is "undecidable" this means there is no counterexample to the Collatz conjecture, since any counterexample would disprove it. But the Collatz conjecture does exactly state that there are no counterexamples. Which means: If it is undecidable, it is true.
Which seems a bit paradoxical. If you can prove that the Collatz conjecture is undecidable, you would also prove that it has no counterexamples, and thus that it is true. Which would make it decidable -- contradiction. So this seems to prove that if the Collatz conjecture is undecidable, this fact is itself also undecidable.
That is the case for something like Goldbach's Conjecture, which says that every even number > 2 is the sum of two primes. If it's false, then there is a counterexample, and it is easy to prove whether or not a given number is a counterexample (just loop over all pairs of smaller primes).
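A sketch of that check, with naive trial division standing in for whatever primality test you'd actually use:

```python
def is_prime(n):
    """Naive trial-division primality test (fine for a sketch)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_goldbach_counterexample(n):
    """True iff n is an even number > 2 that is not the sum of two primes."""
    if n <= 2 or n % 2 != 0:
        return False
    return not any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

# the conjecture says this should never print anything
for n in range(4, 10_000, 2):
    if is_goldbach_counterexample(n):
        print(n)
```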
But that is not the case for the Collatz Conjecture. One kind of Collatz counterexample would be a number whose orbit enters a cycle that never reaches 1; that would be a provable counterexample. Another kind would be a number whose orbit never reaches 1 and never repeats, it just keeps growing forever. If such an infinite orbit existed, it might not be possible to prove that it is infinite. And if it isn't provable, then the conjecture would be both undecidable and false.
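To make the asymmetry concrete: a bounded search like the sketch below can certify "reaches 1" or "cycle" (the provable kind of counterexample), but when the step budget runs out it can only answer "unknown", never "diverges forever". The function name and step limit are just illustrative.

```python
def collatz_status(n, max_steps=1_000_000):
    """Follow n's Collatz orbit, detecting cycles along the way."""
    seen = set()
    for _ in range(max_steps):
        if n == 1:
            return "reaches 1"
        if n in seen:
            return "cycle"          # a verifiable counterexample
        seen.add(n)
        n = 3 * n + 1 if n % 2 else n // 2
    return "unknown"                # could be divergent, could just be slow

print(collatz_status(27))           # "reaches 1", after 111 steps
```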
I would say the second law is more relevant: if you want to reduce entropy in one part of a system, you have to increase entropy by at least as much elsewhere, which means expending free energy.
Both the Bayesian perspective and the optimization perspective are legitimate ways of understanding the Kalman filter. I like the Bayesian perspective better.
Forgive me, I'm thoroughly confused by that dichotomy. How are they different? Approaching it from Bayes' rule or from a "maximum likelihood" angle produces the same results.
The result is identical; the understanding is different. I would suggest that the Bayesian perspective leads to insights like the UKF [1], which IME is all-round much better than the apparently better-known EKF for approximating nonlinear systems.
[1] That is, it is generally easier to approximate a distribution than a nonlinear function.
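To make that footnote concrete, here is a rough sketch of the unscented transform at the heart of the UKF: instead of linearizing the function (as the EKF does), it pushes a small set of sigma points through it and re-estimates the mean and covariance from the results. Parameter names and defaults follow the common scaled formulation; the polar-to-Cartesian example at the end is just an illustration, not anything from the comments above.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f
    using 2n+1 sigma points, rather than linearizing f."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)            # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(x) for x in sigma])                # push points through f
    mean_y = wm @ Y
    cov_y = (wc * (Y - mean_y).T) @ (Y - mean_y)
    return mean_y, cov_y

# toy example: polar (range, bearing) -> Cartesian, a classic case where
# approximating the distribution beats linearizing the function
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
mean = np.array([1.0, np.pi / 4])
cov = np.diag([0.01, 0.1])
print(unscented_transform(mean, cov, f))
```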
I think you need to recalibrate your perceptions; the world (including the US) is about as safe as it's ever been. And I don't think hypocritical is the right choice of word anyway.
And with sensible planning, like overcooling during the daytime, it could actually use the production effectively. Nights that are too hot are still relatively rare.
Also, the parts of science that affect our everyday lives come together to form a coherent picture which explains many different observations. Religions don't have anything like that.