These are two articles I liked that are referenced in the Python ImageHash library on PyPI; the second article is a follow-up to the first.
Here are the paraphrased steps and result from the first article for hashing an image:
1. Reduce size. The fastest way to remove high frequencies and detail is to shrink the image. In this case, shrink it to 8x8 so that there are 64 total pixels.
2. Reduce color. The tiny 8x8 picture is converted to a grayscale. This changes the hash from 64 pixels (64 red, 64 green, and 64 blue) to 64 total colors.
3. Average the colors. Compute the mean value of the 64 colors.
4. Compute the bits. Each bit is simply set based on whether the color value is above or below the mean.
5. Construct the hash. Set the 64 bits into a 64-bit integer. The order does not matter, just as long as you are consistent.
The resulting hash won't change if the image is scaled or the aspect ratio changes. Increasing or decreasing the brightness or contrast, or even altering the colors won't dramatically change the hash value.
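Here's a minimal sketch of those steps in Python using Pillow; this is my own illustration of the article's recipe, not the ImageHash library's actual implementation:

    from PIL import Image

    def average_hash(path, hash_size=8):
        # Steps 1 & 2: shrink to 8x8 and convert to grayscale,
        # discarding high frequencies, detail, and color.
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        # Step 3: compute the mean of the 64 gray values.
        mean = sum(pixels) / len(pixels)
        # Steps 4 & 5: one bit per pixel (above/below the mean),
        # packed in a consistent order into a 64-bit integer.
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (px > mean)
        return bits

Two hashes are then compared by Hamming distance, e.g. bin(h1 ^ h2).count("1"); 0 means a likely match, and small distances suggest near-duplicates.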
"Keep in mind that the 5th Congress did not really need to struggle over the intentions of the drafters of the Constitutions in creating this Act as many of its members were the drafters of the Constitution."
"Clearly, the nation's founders serving in the 5th Congress, and there were many of them, believed that mandated health insurance coverage was permitted within the limits established by our Constitution."
This seems like a fallacy of composition, deployed to persuade the reader. By my rough count, just 6 of the original founders who signed the Constitution were still in Congress at this time, or just 18% of the signers[1]. There's no roll call vote that I can find, but signer Charles Pinckney had voiced general opposition and thought "it only reasonable and equitable that these persons pay for the benefit for which they were themselves to receive, and it would be neither just nor fair for other persons to pay it"[2].
"And when the Bill came to the desk of President John Adams for signature, I think it’s safe to assume that the man in that chair had a pretty good grasp on what the framers had in mind."
This just points to the same argument that's always being made between spirit-of-the-law and letter-of-the-law proponents: ~4% of the 5th Congress were signers of the Constitution, and we don't even know how they voted on this. So ~96% of Congress were in essentially the same spirit-vs-letter dispute that we're in today.
This is awesome, congratulations. I'm glad to see some text-to-SQL models being created. Shameless plug: I just realized you used NSText2SQL[1], which itself contains my text-to-SQL dataset, sql-create-context[2], so I'm honored. I used sqlglot pretty heavily on it as well.
Do you think a 3B model might also be in the future, or something small enough that can be loaded up in Transformers.js?
I also think this is the route we're heading: a few 1-7B or 14B-param models that are very good at their tasks, stitched together with a model that's very good at delegating. Huggingface has Transformers Agents, which "provides a natural language API on top of transformers: we define a set of curated tools and design an agent to interpret natural language and to use these tools".
Some of the tools it already has are:
- Document question answering: given a document (such as a PDF) in image format, answer a question on this document (Donut)
- Text question answering: given a long text and a question, answer the question in the text (Flan-T5)
- Unconditional image captioning: Caption the image! (BLIP)
- Image question answering: given an image, answer a question on this image (VILT)
- Image segmentation: given an image and a prompt, output the segmentation mask of that prompt (CLIPSeg)
- Speech to text: given an audio recording of a person talking, transcribe the speech into text (Whisper)
- Text to speech: convert text to speech (SpeechT5)
- Zero-shot text classification: given a text and a list of labels, identify to which label the text corresponds the most (BART)
- Text summarization: summarize a long text in one or a few sentences (BART)
- Translation: translate the text into a given language (NLLB)
- Text downloader: download a text from a web URL
- Text to image: generate an image according to a prompt, leveraging stable diffusion
- Image transformation: modify an image given an initial image and a prompt, leveraging instruct pix2pix stable diffusion
- Text to video: generate a small video according to a prompt, leveraging damo-vilab
It's written in a way that allows the addition of custom tools so you can add use cases or swap models in and out.
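For reference, here's roughly what using it looked like with the 2023-era API (the HfAgent class and the starcoder endpoint are from Huggingface's docs of that period; the agents API has been reworked since, so treat this as an illustrative sketch):

    from transformers import HfAgent

    long_text = "..."  # any document you want processed

    # The agent is backed by a remote LLM that plans which curated tools to call.
    agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

    # It interprets the instruction, picks tools (here the BART summarizer,
    # then the NLLB translator), writes code chaining them, and executes it.
    result = agent.run("Summarize `text`, then translate the summary to French.",
                       text=long_text)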
I like the analogy to a router and local Mixture of Experts; that's basically how I see things going, as well. (Also, agreed that Huggingface has really gone far in making it possible to build such systems across many models.)
There's also another, related sense in which we want routing across models for efficiency reasons in the local setting, even for tasks over the same input modalities:
First, attempt prediction on a small(er) model, and if the constrained output is not sufficiently high-probability (with high calibration reliability), route to progressively larger models. If the process is exhausted, kick it to a human for further adjudication/checking.
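A minimal sketch of that cascade, assuming each model exposes a hypothetical predict() returning a label plus a calibrated probability (both the interface and the 0.9 threshold are made up for illustration):

    def cascaded_predict(x, models, threshold=0.9):
        # `models` is ordered smallest to largest.
        for model in models:
            label, prob = model.predict(x)  # assumed: calibrated confidence
            if prob >= threshold:
                return label  # confident enough; stop at the cheap model
        return None  # process exhausted: kick to a human for adjudication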
This shows up as zero for me, but the badge site says I've used it twice. It's pretty easy to check manually since I haven't made too many comments. I suspect the badge site is fuzzy matching, since "luck" or some variation has appeared twice (now three times).
A reason I like it is that I have an "older" AMD GPU which is no longer supported by ROCm (roughly AMD's version of CUDA). Running locally, that means I'm either trying to figure out older ROCm builds to use my GPU and hitting dependency issues, or falling back to my CPU, which isn't great either. But with WebGPU I'm able to run these models on my GPU, which has been much faster than using the .cpp builds.
It's also fairly easy to route a Flask server to these models with websockets, so I've been able to run Python, pass data over to the model to run on the GPU, and pass the response back to the program. Again, there's probably a better way, but it's cool to have my own personal API for an LLM.
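A minimal sketch of that bridge, using flask-sock for the websocket side (the /llm route and the queue hand-off are my own illustration, not the commenter's actual setup; the browser tab running the WebGPU model is assumed to connect and answer each prompt it receives):

    import queue
    from flask import Flask
    from flask_sock import Sock  # pip install flask flask-sock

    app = Flask(__name__)
    sock = Sock(app)
    prompts, replies = queue.Queue(), queue.Queue()

    @sock.route("/llm")
    def llm_bridge(ws):
        # The browser tab running the WebGPU model connects here: it pulls
        # each prompt, generates on the GPU, and sends the text back.
        while True:
            ws.send(prompts.get())
            replies.put(ws.receive())

    def ask(prompt, timeout=120):
        # Blocking helper the rest of the Python program calls.
        prompts.put(prompt)
        return replies.get(timeout=timeout)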
- Gortyn Code "Thanks" speech video [1]
- Mike Towndrow/Indie Game Awards Retraction announcement video [2]
- IGA FAQ Game Eligibility, info on retraction [3]
"Initially discovered through itch.io’s Game Boy Competition 2023 and later played on cart, Gortyn Code was selected as an Indie Vanguard due to their impressive work in GB Studio and for crafting such an amazing throwback for the modern day. The physical cart of Chantey is being produced and sold by ModRetro, and it is the sole marketplace where it can be purchased. The IGAs nomination committee were unfortunately made aware of ModRetro’s vile nature the day after the 2025 premiere with the news of their horrid and disgusting handheld console. As the company strictly goes against the values of the IGAs, and due to the ties with ModRetro, the Indie Vanguard recognition has also been retracted.
The decision does not reflect Gortyn Code, but ModRetro alone. Chantey remains a wonderful throwback to the Game Boy era. We encourage you to continue following their journey on itch.io."
- ModRetro's Anduril Edition Chromatic, all proceeds go to support veteran suicide prevention [4][5]
---
[1] https://www.twitch.tv/videos/2647339751?t=0h45m15s
[2] https://bsky.app/profile/indiegameawards.gg/post/3magufzccy2...
[3] https://www.indiegameawards.gg/faq#:~:text=Initially%20disco...
[4] https://modretro.com/products/anduril-chromatic-porta-pro-bu...
[5] https://x.com/PalmerLuckey/status/2002221958700872045