> In fact, most software isn't security critical, at all. If you are writing software which is security critical, then I can understand this confusion; but you have to remember that most people don't.
No one knows what software will be security critical when it's written. We usually only find out after it's already too late.

Language maintainers have no idea what code will be written. The people writing libraries have no idea how their library will be used. The application developers often don't realize the security implications of their choices. Operating systems don't know much about what they're managing. Users may not even realize what software they're running at all, let alone the many differing assumptions about threat model implicitly encoded into different parts of the stack.

Decades of trying to confine the burden of writing "security critical code" to only the components deemed security critical have resulted in an ecosystem where virtually nothing that actually is security critical meets that bar. Take libxml2 as an example.

FWIW, I disagree with the article's position that fail-stop is the best solution in general, but there is at least experimental evidence to support it. The industry has tried many different approaches to these problems in the past; we should use the lessons of that history.



> The people writing libraries have no idea how their library will be used.

Unless you're paying them, the people writing the libraries have no obligation to care. The real issue is that Big Tech built itself on the backs of volunteer labor and expects that labor to provide enterprise-grade security guarantees. That's entitled and wholly unreasonable.

> Take libxml2 as an example.

libxml2 is an excellent example. I recommend you read what its maintainer has to say [1].

[1] https://gitlab.gnome.org/GNOME/libxml2/-/issues/913#note_243...


That's part of my point. As Nick says, libxml2 was not designed with security in mind and he has no control over how people use it. Yet under the "security only in the critical components" mindset, he's left bearing the costs of security-critical development entirely on his own since Daniel left. That sucks.

But this isn't a conversation limited to the Big Tech parasitism Nick is talking about. A quick check on my FOSS system implicates the text editor, the system monitor, the office suite, the windowing system, the photo editor, flatpak, the IDEs, the internationalization libraries, a few daemons, etc. as all depending on libxml2 and its nonexistent security guarantees.
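
For what it's worth, that check is easy to reproduce. A minimal sketch, assuming a Debian/Ubuntu-style system where the shared library package is named libxml2 and apt-cache is available (other distros have equivalent queries, e.g. pactree -r or dnf repoquery --whatrequires):

    # Minimal sketch: list installed packages that depend on libxml2.
    # Assumes a Debian/Ubuntu-style system with apt-cache on the PATH.
    import subprocess

    result = subprocess.run(
        ["apt-cache", "rdepends", "--installed", "libxml2"],
        capture_output=True, text=True, check=True,
    )

    # The first two lines of output are the package name and the
    # "Reverse Depends:" header; the rest list one dependent package per
    # line, sometimes prefixed with "|" for alternative dependencies.
    for line in result.stdout.splitlines()[2:]:
        print(line.strip().lstrip("|"))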



