
"we have been open-source long before it was fashionable"

An abridged timeline:

1960s to 1980s: hobbyist and academic/research computing create thriving public domain software ecosystems (literally the birth of FOSS)

1983: The GNU Project begins

1989: The World Wide Web is created

1991: Linus Torvalds posts the first Linux kernel to USENET

1992: 386BSD is released; Slackware is created

1993: NetBSD is forked; Debian is created

1994: FreeBSD 2 is released

1995: Red Hat is created

[a decade of FOSS and the internet changing computing and research forever]

2005: A collection of low-cost microcontroller education tools, benefiting from half a century of FOSS, is formalized into something called "Arduino"



Also, didn't early Arduino heavily borrow from another open-source project, "Processing"?

Processing was/is graphics-centered, so that's where Arduino's term "sketch" comes from, if you ever wondered.

https://en.wikipedia.org/wiki/File:Processing_screen_shot.pn...

https://en.wikipedia.org/wiki/File:Arduino_IDE_-_Blink.png


"Wiring", which constitutes Arduino's primary API surface, was taken wholesale from Hernando Barragán's 2003 master's thesis project. It was a fork of Processing for microcontrollers and was not written by the Arduino team: Massimo Banzi, David Cuartielles, David Mellis, Gianluca Martino, and Tom Igoe.


I'll have to dig around; I think I still have one of the original Wiring boards from around 2006 (maybe)?


Yeah, the software side is basically just an IDE, a build system, a package manager, and another system API (basically an alternative to libc). Which is useful for C++, but far from irreplaceable.


Embedded in particular was mired in closed-source, vendor-locked toolchains and SDKs for decades.

And some industry players still are. Looking at you, Broadcom and Qualcomm.


I recall AVR-GCC not only working just fine in 2005 but being the official method for compiling code for those chips. I used it before Arduino came out to target the same chips.

Arduino was certainly a nice beginner-friendly IDE that eliminated the need for makefiles or reading GCC documentation, but the existing ecosystem was definitely not closed source.
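For context, the pre-Arduino AVR-GCC workflow the parent describes looked roughly like this; a hedged sketch, assuming avr-gcc, avr-objcopy, and avrdude are installed (the MCU, port, and programmer values are illustrative, not canonical):

```make
# Minimal AVR makefile sketch, circa 2005 (values are illustrative)
MCU    = atmega328p
F_CPU  = 16000000UL
CC     = avr-gcc
CFLAGS = -mmcu=$(MCU) -DF_CPU=$(F_CPU) -Os -Wall

blink.elf: blink.c
	$(CC) $(CFLAGS) -o $@ $<

blink.hex: blink.elf
	avr-objcopy -O ihex -R .eeprom $< $@

flash: blink.hex
	avrdude -p $(MCU) -c arduino -P /dev/ttyUSB0 -U flash:w:$<
```

Arduino's contribution was automating exactly this, not replacing any of the open-source tools underneath it.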


Maybe it was different in 2005, but now Arduino is just an IDE that uses GCC under the hood. So it is still "the official method for compiling code for those chips".


Definitely: it has always used GCC under the hood, and it has also always been just the IDE.

Arduino (the original AVR boards, anyway) has always relied on GCC, and not just GCC but the entire open-source chain that already existed around AVR-GCC. I'm sure they contribute back (I guess "sure" is an exaggeration), but it worked pretty darn well already.

Arduino, for me, replaced Emacs as an IDE. The main reasons I use it are that I don't need to write a makefile, and the integrated serial monitor. Those features are good enough that I still use the IDE even though I haven't touched a real Arduino in a decade or more. But I work alone and don't usually have more than a few thousand lines of code, so it's not too complex to manage.


That tracks with my experience. I don't prefer the IDE because I need a build system that generates unique IDs, conditionally compiles things, and handles debug flags; it also needs to be reproducible and to run unattended. A proper build system is also much faster: Arduino spends nearly half of its build time re-analyzing the project files to figure out what it actually needs to do, and then it creates build artifacts in a semi-random location. I don't think the Arduino IDE supports parallel compilation either.


  By the mid 2000s open-source hardware again became a hub of activity due to the emergence of several major open-source hardware projects and companies, such as OpenCores, RepRap (3D printing), Arduino, Adafruit, SparkFun, and Open Source Ecology. In 2007, Perens reactivated the openhardware.org website, but it is currently (February 2025) inactive. [0]
I think they could have worded this better; what they are known for, more specifically, is pushing open-source hardware forward and sticking with it on principle, even though it caused many business challenges.

[0]: https://en.wikipedia.org/wiki/Open-source_hardware


Arduino doesn't directly benefit from pretty much any of the legacy Unix barf-bag stuff.

It's just a HAL and an IDE, with a truckload of user/third party supplied libraries for various modules, sensors, etc.

Plus, every sizable MCU/dev-board vendor supplies an Arduino HAL implementation (a so-called "core") for their board/MCU/module (or it's done by an enthusiastic community).


They derive a massive benefit from GCC even if they use nothing else at all.

The point of the GP was to refute the claim that they were started “before open source was cool”.


This isn't entirely true.

It's Atmel that derives massive benefit from GCC, or whoever implemented the AVR backend for GCC.

Arduino doesn't - strictly speaking - depend on GCC; it could (and does) use any toolchain that the MCU vendor supplies.

It just happens that many MCU vendors use GCC as part of their toolchain. Arduino just bundles that with vendor-supplied tools for flashing and the like, such as avrdude.

Which is to say - it's the MCU vendors that derive the main benefit from GCC.

Arduino will just happily use whatever toolchain MCU vendors provide.
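That vendor wiring is visible in how an Arduino "core" is defined: each platform ships a platform.txt that tells the IDE which compiler to invoke. A trimmed, illustrative fragment (the key names follow the Arduino platform specification; the exact values and flags here are assumptions):

```ini
# platform.txt (illustrative AVR-style fragment)
# The core, not Arduino itself, decides which toolchain is used.
compiler.path={runtime.tools.avr-gcc.path}/bin/
compiler.c.cmd=avr-gcc
compiler.cpp.cmd=avr-g++

# Recipe the IDE expands to compile each C file:
recipe.c.o.pattern="{compiler.path}{compiler.c.cmd}" {compiler.c.flags} -mmcu={build.mcu} "{source_file}" -o "{object_file}"
```

Swap the compiler.*.cmd entries and the same IDE drives a completely different vendor toolchain.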


> 1989: Tim Berners-Lee invents the World Wide Web

I think the ideas etc. existed before that, e.g. DARPA and what Alan Kay said.

Tim mostly pushed forward a simple protocol that worked. It would be interesting to see how much Tim really generated de novo, but in general I disagree that he "invented" the World Wide Web as such. That would seem unfair to many other people; just like Alan Kay once said, you see further by standing on the shoulders of giants (translation: you benefited from earlier inventions and ideas made by other people).


As I was writing it out, I knew someone was going to complain.

It's an abridged timeline. Brevity because the point is the date, not the fine detail.

But since I don't care to argue on the internet... edited.


There's always someone who has to be pedantic


> Would be interesting to see how much Tim really generated de-novo, but in general I disagree that he "invented" the world wide web as such.

Eh? What do you mean it would be interesting to see? It's well-documented. Not controversial or hidden.

The HTTP protocol, yes. But the browser/editor app (WorldWideWeb), a web server for it, and the URL scheme are also literal Berners-Lee inventions. HTML may be an SGML language, but it's his SGML language.

He's not claiming and nobody is claiming he invented hypertext (he would say Ted Nelson and Alan Kay).

He absolutely invented the fundamentals of the end-to-end web technology as we use it. There was no functioning open hypermedia system on the internet before 1990. It's just not in question, and it's kind of disingenuous to imply he didn't do much.

(Defining down "invent" in this way is also disingenuous to all inventors, who all do their work in the context of prior art)


Complained about and already modified. However, what is "wrong" is the idea that some "person" invented the internet. We live in a time of "followers", and in that paradigm we need some singular person to follow. But it was actually a bunch of original thinkers, of whom TBL was one. It was not a person. I suspect a closer answer is the IETF, but that is also a leaky abstraction.

The point is that if you want to do something, you are probably more likely to do it well with lots of other doers. Not followers.


The World Wide Web is not the Internet.



