
You don't buy it? This is foundational. It isn't something I'm making up; it's the formal definition of entropy.

https://www.labxchange.org/library/items/lb:LabXchange:ac117....



Thanks for linking that. My point really was that entropy is many things and probability is among them.


I think his point is that it is not clear why every microstate is equally probable.


It's an assumption we make due to imperfect knowledge of a system. We assume all microstates have equal probability.

Just like rolling 6 dice. We assume all configurations of the 6 dice have equal probability.

What's the probability of rolling exactly 1,1,2,6,3,5? It's exactly the same as the probability of rolling 6,6,6,6,6,6. We instinctively assign these probabilities based on assumptions: we assume each face has a 1/6 probability, so rolling exactly a certain number on each of 6 rolls yields (1/6)^6.
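As a quick sketch (assuming fair, independent dice), any one specific sequence of 6 rolls is a single microstate out of 6^6, so both sequences above share the same probability:

```python
# Probability of one exact sequence of 6 rolls with fair, independent dice.
# Applies equally to (1,1,2,6,3,5) and (6,6,6,6,6,6): each is one
# microstate out of 6**6 equally likely microstates.
p = (1 / 6) ** 6
print(p)  # 1/46656, roughly 2.14e-05
```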

It's the macrostates that subjectively group these microstates in various ways. If I pick the macrostate where all dice show the same face, there are only 6 microstates for it. If I make up the macrostate "at least one die is different", that's every possible microstate except the 6 where they are all the same.
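The grouping above can be checked by brute force (a sketch, assuming fair six-sided dice): enumerate all 6^6 microstates, then count how many fall into each macrostate.

```python
from itertools import product

# All 6**6 = 46656 equally likely microstates of rolling 6 dice.
microstates = list(product(range(1, 7), repeat=6))
total = len(microstates)

# Macrostate "all dice show the same face": exactly 6 microstates.
all_same = [m for m in microstates if len(set(m)) == 1]

# Macrostate "at least one die is different": everything else.
at_least_one_diff = total - len(all_same)

print(total, len(all_same), at_least_one_diff)  # 46656 6 46650
```

So the "all same" macrostate is 46650/6 ≈ 7775 times less probable than "at least one different", even though every individual microstate is equally likely.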



