Hacker News

I wonder how this can be used with an LLM to provide interesting tax advice? I'd love to regularly ask questions of the tax code...


patio11's already saved over $2k apparently, maybe he'll do a more formal write-up at some point. (A couple threads here https://x.com/patio11/status/1977425626584711668 and here https://x.com/patio11/status/1978168404793037087 )


I am quite likely to do a more formal writeup in the next few weeks, unless Zvi beats me to it. (He had, apparently, directionally similar results.)


Any idea what the actual deduction was that it supposedly found for private school?

You can pay for K-12 with 529 or Coverdell ESA funds. But neither allows deductions for contributions. Only growth in either is tax free (assuming it’s spent on education expenses).


Many states allow a state tax deduction for 529 contributions, which could net you up to an 8ish% discount if you’re in a high tax locality (e.g. NYC).
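Back-of-the-envelope, with made-up numbers (the contribution amount and the ~8% combined rate are assumptions for illustration, not any state's actual rules):

```python
# Hypothetical: a $10,000 529 contribution in a state/city that lets you
# deduct it against a combined ~8% state+local marginal income tax rate.
contribution = 10_000
state_local_rate = 0.08  # assumed combined marginal rate, NYC-ish

tax_savings = contribution * state_local_rate
print(f"Contribution: ${contribution:,}")
print(f"State/local tax saved: ${tax_savings:,.0f}")  # the ~8% "discount"
```

The "discount" is just the deduction times your marginal state+local rate, which is why it only matters in high-tax localities.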



Okay I was thinking only of federal taxes (for which there’s no deduction). Thanks for the detail.


I've also saved a bit of money on taxes just by thinking about possible deductions and asking LLMs whether they exist. Of course, to actually claim such deductions I still have to follow the instructions from the IRS/state tax agencies, so it's hallucination-proof: I'm manually reading the agencies' instructions to understand how to claim them.


I guess as long as it's for entertainment purposes only. I'm going to file "actually following tax/legal advice from a potentially hallucinating LLM" under NOPE.


The super obvious workflow is to query for an idea in natural English and then verify it, or ask the LLM to provide the sources it was following.

It begs the question why you assume the parent comment was going to blindly follow the LLM's output.


> It begs the question why you assume the parent comment was going to blindly follow the LLM's output.

Many people do


Makes me wonder if someone has already trained a model on the tax code. Would be interesting for sure.


Model training data already contains all the text there is[0], so they can already answer questions like this (especially with web search), but they aren't good at tax calculations.

https://arxiv.org/abs/2507.16126v1

[0] but it's quite possible the conversion from HTML to text is bad


The problem is that the text of US tax code isn't enough to know the correct action to take. The IRS has semi-formal policies based on how it has chosen to interpret the statutes. There are areas of gray that they don't clearly specify. Some of this is in supplementary publications but it still has subjective elements. One example is that settlements for "serious injuries" are regarded as non-taxable income. What constitutes serious is a squishy concept.


Yeah, you'd have to pull in a lot of case law and do a lot of fine-tuning on expert tax advice (you'd probably have to create this training data yourself).

Would be neat (and still legally fraught!).


You can technically use the language model as a data model. That was the quick hack that started it all: autocomplete on a question produces the answer, yes.

However, it's clear that we are moving towards separating the data from the language model. Even base ChatGPT is given search tools and Python tools instead of producing everything as text, though the tool call itself may still be generated by the model.

You can for sure use a pure LLM to ask it questions about the tax code, but we'll probably see specific tools that contain only canon law and kosher case law, and source everything properly. Y'know, instead of hallucinating.
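A minimal sketch of that retrieve-and-cite idea: a tool returns verbatim passages with source IDs, and the model is only supposed to summarize what the tool hands back. The corpus entries and section labels below are placeholders, not real statutes.

```python
# Hypothetical mini-corpus: citation ID -> verbatim passage.
CORPUS = {
    "sec-529(c)(7)": "Distributions for tuition at an elementary or "
                     "secondary school are treated as qualified ...",
    "sec-530(b)": "A Coverdell education savings account may be used "
                  "for qualified elementary and secondary expenses ...",
}

def search_tool(query: str) -> list[dict]:
    """Naive keyword search; returns passages with their citation IDs
    so every claim can be traced back to a source."""
    terms = query.lower().split()
    hits = []
    for cite, text in CORPUS.items():
        if any(t in text.lower() for t in terms):
            hits.append({"citation": cite, "passage": text})
    return hits

for hit in search_tool("elementary tuition"):
    print(f"[{hit['citation']}] {hit['passage'][:60]}...")
```

In practice the retrieval would be semantic rather than keyword-based, but the point is the same: the LLM composes an answer only from cited passages it can show you.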



