Hacker News | peetle's comments

In my own experience, nano banana still has the tendency to:

- make massive, seemingly random edits to images
- adjust image scale
- make very fine-grained but pervasive detail changes, obvious in an image diff

For instance, I have found that nano-banana will sporadically add a (convincing) fireplace to a room or a new garage behind a house. This happens even with explicit "ALL CAPS" instructions not to do so, and even when the temperature is set to zero, which makes it impossible to build a reliable app.

Has anyone had a better experience?


The "ALL CAPS" part of your comment got me thinking. I imagine most LLMs understand the subtle meanings of upper-case text depending on context. But, as I understand it, ALL CAPS text will tokenize differently than lowercase text. Is that right? In that case, won't upper case be harder for most models to understand and follow, since it's less common in their training data?


There's more than enough ALL CAPS text in the corpus of the entire internet, and enough semantic context associated with it, for models to recognize it as the imperative voice.


Shouldn't all caps be normalized to the same tokens as lowercase? There were no separate tokens for upper and lower case in Llama, or at least there weren't in the past.


Looking at the tokenizer for the older Llama 2 model, the tokenizer has capital letters in it: https://huggingface.co/meta-llama/Llama-2-7b-hf
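The effect being discussed can be sketched with a toy vocabulary. This is a hypothetical illustration, not the real Llama vocabulary, and a real BPE tokenizer merges byte pairs iteratively rather than doing greedy longest-match; the point is only that vocabulary entries are case-sensitive, so ALL CAPS text tends to fall back to shorter pieces and produce more tokens:

```python
# Toy greedy longest-match tokenizer (NOT a real BPE implementation).
# Merged tokens in the vocabulary are case-sensitive, so ALL CAPS text
# often falls back to single characters and yields more tokens than
# its lowercase equivalent. The vocabulary below is entirely made up.

VOCAB = {
    # common lowercase merges, as learned from a (hypothetical) corpus
    "do", "not", "add", "fire", "place", "fireplace",
    # uppercase text is rarer, so fewer long merges exist for it
    "DO", "NOT",
    # single characters are always available as a fallback
    *"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ",
}

def tokenize(text: str) -> list[str]:
    """Greedy longest-match segmentation against VOCAB."""
    tokens = []
    for word in text.split():
        i = 0
        while i < len(word):
            # find the longest vocabulary entry starting at position i
            for j in range(len(word), i, -1):
                if word[i:j] in VOCAB:
                    tokens.append(word[i:j])
                    i = j
                    break
    return tokens

print(tokenize("do not add fireplace"))   # ['do', 'not', 'add', 'fireplace']
print(tokenize("DO NOT ADD FIREPLACE"))   # 14 tokens vs. 4: caps fall back to single characters
```

Under this (simplified) model, an ALL CAPS instruction costs more tokens and looks less like the training distribution, which is roughly the concern raised above.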


I work on the PixLab prompt based photo editor (https://editor.pixlab.io), and it follows exactly what you type with explicit CAPS.


The right answer for most people is to simply leave and pursue their ideals in a more favorable venue.


How could that more favorable venue be made, and have been made visible to him and to people like him?


The same thing has happened to me with political donations. Every day I receive an email from a different candidate. It's like whack-a-mole.


Great graphic on the environmental impacts of different kinds of grocery bags: https://ourworldindata.org/grapher/grocery-bag-environmental...

In short, single use plastic bags are very hard to beat.


Yes, but who wants to pay more for them? It's very hard to beat single use plastic bags for environmental impact: https://ourworldindata.org/grapher/grocery-bag-environmental...


Higharc | Senior Frontend Engineer, Senior SDET | Full-time | Durham, NC, San Francisco, CA, remote | higharc.com

Higharc has built a cloud platform for homebuilders (who build >80% of the homes constructed in the US every year) that massively reduces the burden of producing construction documents, material counts, and sales tools. At the center of the platform is an in-browser CAD system that lets builders model homes with the information needed to support the complexity of their business. We raised a $53 million Series B in 2023.

Technologies: Next.js, TypeScript, React

More details and application here: https://www.higharc.com/company/careers


Correct me if I'm wrong, but it appears that all the data about these "UFOs":

1. is conspicuously low quality and grainy
2. presents inconsistent imagery of the phenomenon
3. originates from sensor equipment most laymen are unfamiliar with, and thus can't know whether it has been spoofed
4. is presented in the context of clickbaity commentary


It's funny that you make this claim on a post with clear video of radar data: it's like the post already pre-emptively corrected you, but you said it anyway.

But yes, in general I agree with you on 1. I think the "grainy footage" status quo is an embarrassing debacle.


It always ends up being something completely explainable, like birds, planes, floating balloons, imaging artifacts, out-of-focus stars, what have you. Forgive me for not suspending my disbelief this time when every previous time disbelief has ended up winning the day.

It's well known that you can get all sorts of problematic radar artifacts that don't end up being real objects. Hell, the world was almost ended over some: https://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alar...


Crying out for help to be convinced. It's a sad-sounding existence you related in that comment. Without hope. Defeated. ;p ;) xx

The 1997 Phoenix Lights, the 1960s Friendship case in Italy, the Belgian UFO wave, the Mexico City mass sighting... it doesn't "always turn out to be something explainable" at all.


Radar displays will display whatever the underlying circuitry produces. Faulty circuitry will produce erroneous displays.

Faulty does not necessarily mean "broken": it could simply be that the software being run has many bugs and produces echoes on the display that don't exist in reality.

As far as I can see, if the UFOs are not secret new aircraft, then they are nonexistent artifacts of inadequate electronics.


That's a possibility and a fair point, but only if these data were not independently confirmed by other sensors: night vision footage, iPhones, FLIR targeting pod footage, and the pilots themselves. You can't blame faulty sensors when you get the same signal across multiple unrelated systems.

I'm not saying you're saying this specifically, but some people do, and that's crazy. It's a conspiracy theory to suggest that all the sensors, humans included, failed at the same time in the same way; that's reaching for a wild explanation just to deny evidence, or to preserve a belief system challenged by that evidence.


It's a thermal vision scan; inherently grainy.


Interesting, but they commissioned it. :) I'm curious whether there are other assessments with different methodologies and different outcomes.


Is it even interesting though? How can it be trusted at all?

This kind of study is really easy to massage into the result you want. You're dealing with extremely variable, large-scale supply chains.


Agreed


Higharc | Senior/Junior Backend, Frontend, Graphics/CAD, Full-Stack, EM, VPE | US, REMOTE | https://higharc.com

We're building a complete home-building web platform. Our system, built with TypeScript, C++, and WebGL, produces permit-ready plan sets, bills of materials, and beautiful product configurators with realtime estimates/BOMs. We just raised a $21M Series A and are hiring across a wide range of engineering roles and seniorities.

We are a remote-first team (w/ offices in Durham, NC, SF, and Atlanta).

Roles:

- Frontend engineers (next.js, React, pure TypeScript)

- Backend engineers (node.js, PostgreSQL, terraform, AWS)

- Computational geometry, graphics, and CAD engineers (Unreal Engine, WebGL, three.js)

- Engineering managers (Modeling/CAD, Web)

Our careers page is at https://higharc.com/careers. Say you came from Hacker News.


remote globally?


This looks exciting, but I am obsessed with the lack of subpixel rendering for fonts in Godot (imgui, nanovg, etc.). The fonts just look crummy to me.

As I understand it, subpixel rendering is a treacherous patent landscape. Is that the reason why it is consistently missing from these tools?


It's more because it's a fairly niche technique that requires either dual-source blending or constant blend colors, things that game rendering engineers don't think about often. It's also tricky to make work with transparency, which is the reason OS X ditched it. Basically, nobody in the video games space does it frequently.

Maybe I should clean up and open-source my subpixel text renderer that uses dual-source blending....
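The dual-source requirement comes down to the blend equation: subpixel (LCD-filtered) text has a separate coverage value for the red, green, and blue subpixels, so the blend factor must vary per channel (in OpenGL terms, GL_SRC1_COLOR / GL_ONE_MINUS_SRC1_COLOR from a second fragment shader output). A minimal numeric sketch in plain Python, not GPU code, with made-up coverage values:

```python
def blend_single_alpha(dst, glyph, coverage):
    """Ordinary alpha blending: one scalar coverage for the whole pixel."""
    a = sum(coverage) / 3.0  # must collapse per-subpixel coverage to one alpha
    return tuple(g * a + d * (1.0 - a) for g, d in zip(glyph, dst))

def blend_dual_source(dst, glyph, coverage):
    """Dual-source blending: each channel gets its own coverage factor,
    which is what per-subpixel text filtering requires."""
    return tuple(g * c + d * (1.0 - c) for g, d, c in zip(glyph, dst, coverage))

# Black text on a white background. Suppose the glyph edge covers the
# red subpixel fully, the green half, and the blue not at all.
dst, glyph = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
coverage = (1.0, 0.5, 0.0)

print(blend_single_alpha(dst, glyph, coverage))  # (0.5, 0.5, 0.5): subpixel detail lost
print(blend_dual_source(dst, glyph, coverage))   # (0.0, 0.5, 1.0): per-subpixel fringe kept
```

With a single alpha, the three coverage values collapse to a uniform gray; the per-channel version preserves the color fringe that makes subpixel rendering sharper. Constant blend colors can emulate this, but only with one pass per text color.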


Subpixel text rendering also requires intimate knowledge of the display being rasterized for, and becomes a mess once you enter multiple displays with different pixel layouts. I was working on support in my font library https://github.com/mooman219/fontdue/, but it quickly becomes not worth it. I'm interested to see your approach to the problem if your project is online!


do it man! point the godot devs to it as well :D


FYI, there's no subpixel rendering in macOS anymore (since 10.15).


As far as I know all Microsoft patents expired in this area.

edit: seems like it wasn't too long ago.

https://www.freetype.org/patents.html


After you see really nice fonts and get used to them, bad ones just drive you crazy. A 1440p or 4K monitor, for instance: after having one, you don't want to use a 1080p monitor for text anymore!

I imagine for games it's partly a performance issue that they sometimes don't do fancier font rendering.

