Yes, I am an old, boring fart who isn't cloud native. And this post is going to offend a lot of people. I am sorry for that, but I still believe this is an important point to make:
From my perspective, the root issue is that people are using the wrong tools for their projects, namely the wrong programming language.
The problem started when people began abusing scripting languages for something other than scripting.
Python was meant as a teaching language for kids. JavaScript was meant for some gimmicks on websites. .NET was meant for UI applications. PHP stood for "Personal Home Page Tools".
But somehow people started using these tools for something completely different than what they were meant for.
Because people abused those languages to write server backends, it suddenly became a problem that they produce code running 1,000 to 75,000 times slower than native code.
That then brought the need for clusters, load balancers, etc. And then the need for tools to manage those. Then tools to manage those tools. And tools to manage the people who manage those tools. And every year I look at the industry, another layer of complexity has been added.
So, the author writes: "Our platform back then was 50% .Net and 50% Python" - and here are the ACTUAL lessons he should have taken away:
"Eight years ago, after our 'developers' had drafted our products in scripting languages, we hired a senior C/C++/Pascal/Rust programmer. That developer rewrote our drafts in a clean way. He used a profiler, checked for memory leaks, and did some optimizations. Afterwards we bought three servers in each of two different data centers, with two upstream providers each. Since then we have had zero downtime, our servers consume only 800 watts in total, our operating costs are minimal, and we are contributing towards a greener planet. Looking at our company's growth rate, those six servers will be good for another 8 years."
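Whatever you think of that (admittedly idealized) testimonial, the "used a profiler" step is the part worth copying in any language, because it tells you which handful of functions actually deserve a native rewrite. A minimal sketch with Python's stdlib cProfile — the function names and workload here are invented for illustration, not taken from anyone's real backend:

```python
import cProfile
import io
import pstats


def parse_row(line):
    # Hypothetical hot helper: called once per input line.
    return [int(x) for x in line.split(",")]


def handle_request(payload):
    # Hypothetical request handler that dominates CPU time.
    return sum(sum(parse_row(line)) for line in payload.splitlines())


# Fake workload: 50,000 CSV lines, each summing to 10.
payload = "\n".join("1,2,3,4" for _ in range(50_000))

profiler = cProfile.Profile()
profiler.enable()
result = handle_request(payload)
profiler.disable()

# Print the five functions with the most cumulative time;
# these are the candidates worth rewriting in native code.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

A profile like this typically shows one or two helpers eating nearly all the time, which is why a targeted native rewrite (or even just a better algorithm) can shrink the server bill without touching the rest of the codebase.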
Think that's hyperbole? No, it's not. If your interpreted scripting language is 1,000x slower than a compiled one, you'll simply need up to 1,000x the resources. You are burning our planet and your money just because you weren't able to accept that it's OK to use a scripting language for quick hacks and... scripting, but that it's the wrong tool for high workloads.
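You can make the interpreter overhead visible from inside Python itself by timing the same arithmetic twice: once in a plain interpreted loop, once inside the C-implemented builtin sum. The gap this toy shows is nowhere near the 1,000x figure above — for a tight numeric loop against a builtin it is more like a single-digit to low-double-digit factor, and the real ratio depends entirely on the workload — but it demonstrates the per-iteration cost the argument rests on:

```python
import timeit

N = 1_000_000


def python_loop():
    # Every iteration pays bytecode dispatch and boxed-integer
    # overhead in the interpreter.
    total = 0
    for i in range(N):
        total += i
    return total


def builtin_sum():
    # Same arithmetic, but the loop runs inside the C-implemented
    # builtin, keeping the interpreter out of the hot path.
    return sum(range(N))


# Both compute the same triangular number.
assert python_loop() == builtin_sum() == N * (N - 1) // 2

t_loop = timeit.timeit(python_loop, number=10)
t_sum = timeit.timeit(builtin_sum, number=10)
print(f"interpreted loop: {t_loop:.3f}s, "
      f"C-backed loop: {t_sum:.3f}s, "
      f"ratio ~{t_loop / t_sum:.1f}x")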
So, for your next project, please consult this checklist:
[ ] Is the project about teaching kids how to program?
[ ] Is the project about adding a blinking button to your website?
[ ] Is the project about running a desktop application on a Windows PC?
[ ] Is the project about doing your personal home page?
If you have not checked any of the boxes above, you might want to consider having your code rewritten in native code, avoiding 90% of your management layers and dependency hell.
And as a bonus, with all the CO2 and money you have saved, you can buy yourself another two SUVs! ;)
"Eight years ago, our engineers drafted our products in scripting languages in frameworks like Django or Rails, and we had a product up and running in a few months"
vs
"Eight years ago, our engineers decided to settle on Rust, and six years ago we went out of business, because it was much harder to quickly write web services that met what customers actually wanted"