
This would have been awesome for me 2 years ago.

Currently, much of my complicated SQL is generated by an LLM.



Hmm, I would think that LLMs help adoption of semantic layers such as PRQL, Malloy, and dbt, since it's possible to generate/validate/iterate on 5 lines of PRQL instead of 25 lines of SQL. But considering that none of them are widely adopted yet, you might indeed be right in a way: LLMs make it harder for new tools to gain adoption by helping you suffer less from the verbosity of SQL.
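
To make that concrete, here's a rough sketch of the kind of compression I mean, against a made-up employees table. PRQL syntax is approximate and shifts a bit between compiler versions, so treat this as illustrative rather than copy-paste ready:

    # PRQL (approximate syntax): derived columns can be reused by name
    from employees                  # hypothetical table
    derive {
      gross_salary = salary + payroll_tax,
      gross_cost   = gross_salary + benefits,
    }
    filter gross_cost > 100000
    group department (
      aggregate {
        avg_gross  = average gross_salary,
        total_cost = sum gross_cost,
      }
    )
    sort {-total_cost}

    -- Roughly the SQL this compiles down to: each derived expression
    -- has to be written out again everywhere it's reused.
    SELECT
      department,
      AVG(salary + payroll_tax)            AS avg_gross,
      SUM(salary + payroll_tax + benefits) AS total_cost
    FROM employees
    WHERE salary + payroll_tax + benefits > 100000
    GROUP BY department
    ORDER BY SUM(salary + payroll_tax + benefits) DESC;

The gap gets much wider once window functions and nested subqueries show up, which is where the "5 lines vs 25 lines" claim really comes from.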


It’s a tough call. I run a small analytics team and am starting to train some analysts to code. Just the other day I basically told one of my reports to focus on learning Python and let ChatGPT teach him SQL by example because I think it’ll be easier to grok the explanations. Now I’m looking at PRQL and Malloy and asking myself if it’s really a path I should send them down, and I’m not sure it’s a good idea.


I just tried using ChatGPT to generate some Malloy snippets, and compared to its SQL, the output is very basic. It's probably not a huge lift to teach it the library by feeding it the docs, but its reasoning in SQL is still much more sophisticated, given the tons of training data available.



