Hacker News | findalex's comments

My measurement of quality going in was how far I need to scroll to see EKS. Very high quality.

The nature of the basic-research beast. There are grad-student-written astrophysics/computational-chemistry spaghetti codes that continue to get big funding for the sole reason (it feels like) that they scale huge and eat up DOE supercomputing time: "look how fast (we burn money)". Maybe a hot take.


> Confusing. Inscrutible. But groundbreaking if we can pull it off.

You know what gets lift? Correct spelling (inscrutable)! Unless they chose that word specifically to misspell it, but that's meeting them more than halfway.


This line alone made me want to read the rest of the page, so good work! Hilarious and self-deprecating is how I took it.


It tickled me too - and TBH I got sidetracked by the misspelling when my browser pointed it out in the HN comment box :)


Right - e.g., if you're modeling a physical system it makes sense to bake in some physics - like symmetry.
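To make that concrete: here's a minimal sketch (my own toy illustration, not anyone's actual model) of baking in a symmetry by group-averaging. If the system is invariant under parity (x -> -x), averaging a raw model over {identity, parity} makes the symmetry hold by construction instead of hoping it gets learned from data.

```python
def raw_model(x):
    # Stand-in for any learned function; has no built-in symmetry.
    return x**3 + 0.5 * x + 2.0

def symmetrized(x):
    # Group-averaging over {x, -x}: guarantees
    # symmetrized(x) == symmetrized(-x) for every x.
    return 0.5 * (raw_model(x) + raw_model(-x))
```

Same trick generalizes to any finite symmetry group (average over all group elements); equivariant architectures just do this more cleverly inside the layers.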


Indeed, and I think natural language and reasoning will have some kind of geometric properties as well. Attention is just a sledgehammer that lets us brute force our way around not understanding that structure well. I think the next step change in AI/LLM abilities will be exploiting this geometry somehow [1,2].

[1] GrokAlign: Geometric Characterisation and Acceleration of Grokking, https://arxiv.org/abs/2510.09782

[2] The Geometry of Reasoning: Flowing Logics in Representation Space, https://arxiv.org/abs/2506.12284


QM would tell us the order of your Hamiltonian (attention operator) doesn't limit the complexity of the wave function (hidden state). It might be more efficient to explicitly correlate certain many-body interactions, but pair-wise interactions, depth, and a basis (hidden state dimension) approaching completeness "are all you need".
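Sticking with the analogy, a throwaway pure-Python sketch (unlearned weights, just the structure) of what "pair-wise operator plus depth" means: one attention layer only ever mixes pairs of positions, and higher-order correlations come from stacking it.

```python
import math

def attention(xs):
    # One pair-wise layer: each output position is a softmax-weighted
    # mix over all (i, j) pairs of input vectors. Queries/keys/values
    # are the inputs themselves here (no learned projections), purely
    # to expose the pair-wise structure of the operator.
    n, d = len(xs), len(xs[0])
    out = []
    for i in range(n):
        scores = [sum(xs[i][k] * xs[j][k] for k in range(d)) / math.sqrt(d)
                  for j in range(n)]
        m = max(scores)                      # subtract max for stability
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        out.append([sum(w[j] * xs[j][k] for j in range(n)) / z
                    for k in range(d)])
    return out

def stack(xs, depth):
    # Many-body correlations emerge only through repeated application
    # of the pair-wise operator, i.e. depth.
    for _ in range(depth):
        xs = attention(xs)
    return xs
```

Each output row is a convex combination of the inputs, so one layer is strictly pair-wise; it's depth (and a big enough hidden dimension) that buys expressivity, which is the DMRG-flavored point above.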


The terminology is overloaded. Tensors in QM are objects obeying transformation laws; in ML, tensors are just data arranged in multidimensional arrays. There are no constraints on how the data transforms.


Intended as an analogy - but it is essentially a description of the DMRG algorithm (quantum chemistry). Only pair-wise operators there, but the theory approaches exactness when there are enough terms in your tensor product (iterations ~ depth) and a large enough embedding dimension.

> There are no constraints on how the data transforms.

Except those implicit in your learned representation. And that representation could be the many-body wave function.


ugh.. i <3 emacs. Thanks for this tidbit.


I enjoyed this comment on many levels.


and pipx.


How boxed in is cosmology by the cosmological principle? If we - as an example - didn't assume the cosmological constant was constant and expected it to vary over large distances, could we arrive at a working model of the universe? Maybe high-density dark matter/energy regions are the same as regions of high/low values of the CC. It's late.

Edit: did some digging - looks like it's actually an active area: https://www.scirp.org/journal/paperinformation?paperid=13446...


> ...must function unfastened from current PsyOps doctrine which prohibits communicating any PsyOps type message or meme to US domestic audiences.

PsyOps doctrine is an interesting way of saying "the law"... at least until NDAA 2012: https://www.congress.gov/bill/112th-congress/house-bill/5736.

