Hacker News
Ask HN: If you only do programming at work, how do you manage your career?
30 points by amazonavocado on April 7, 2020 | hide | past | favorite | 22 comments
I'm kind of surprised at people who say they don't do programming outside of work hours. I was thinking that I needed to in order to stay on top of marketable skills. But if you don't make software on the side outside of work hours, how do you manage your career in programming so you don't fall off the rails?

Were you just lucky enough to get into the right jobs that keep you doing work that is highly in demand for the moment? Looking back, I kinda wish I had started my career with slow-moving Java or .NET enterprise work because, although not very sexy, it is comparatively stable next to front-end web development.



You kind of hit the nail on the head. If you don’t want to constantly be on the treadmill, don’t focus on the front end. The further down the stack you go, the slower technology moves and the easier it is to stay relevant.

Besides, just from looking around, front end pays less and it’s easy for most companies to find cheap “good enough” front end developers.

As far as being “lucky”, it’s not luck. If I see my employer’s stack falling behind the market, it’s time to jump ship. Why would I work at a company all day and then come home at night trying to keep myself marketable instead of just changing jobs?

There is usually a job out there where the “must haves” are $old_tech and the “nice to haves” are $new_tech, rinse and repeat.

You could always take the r/cscareerquestions tack and “learn LeetCode and work for a FAANG” (note sarcasm).


As a counterargument (or perhaps as a corollary), backend programming is also a treadmill these days. It may not have the churn of the world of front-end Web programming, but the world of backend programming in 2020 is quite different from that of 2010. As someone who does mostly backend work, within the past decade I've seen the rise of cloud services, non-relational databases, the Hadoop ecosystem, distributed storage systems, containers, the increased use of languages outside of C/C++/Java for systems programming (e.g., Python, Go, Rust, Clojure, Scala), machine learning, and CUDA, among many other technologies that were either nonexistent or were in their infancy in 2010. A person who stayed stuck in the world of 2010 might be overwhelmed by how much has changed in the past decade. Heck, even C++ has changed dramatically over the past decade; someone coding in C++98 would need to get up to speed with C++11's fundamental changes as well as changes added in later versions.

Now, there are some levels that haven't changed as much in the past decade. The kernels of today's most widely used operating systems are still written in C, and x86-64 still remains the dominant instruction set despite an increased challenge by ARM64 and the possibility of RISC-V. Someone writing kernel-level code in 2010 would feel at home today in 2020, notwithstanding the natural kernel code changes that have always happened. However, the job markets for kernel developers, compiler developers, and those writing low-level system software are much smaller than the job market for backend programmers overall, and it's possible that a laid-off low-level systems software engineer would have to get up to speed in all of the advances that happened in higher levels of the stack in order to more easily find another job.


> As someone who does mostly backend work, within the past decade I've seen the rise of cloud services, non-relational databases, the Hadoop ecosystem, distributed storage systems, containers, the increased use of languages outside of C/C++/Java for systems programming (e.g., Python, Go, Rust, Clojure, Scala), machine learning, and CUDA, among many other technologies that were either nonexistent or were in their infancy in 2010.

You've missed what is maybe the worst offender - kubernetes and microservices.


Kubernetes is not as big as the hype makes it out to be.

Microservices have been around for over a decade. They were called service-oriented architecture, and instead of REST with JSON they were sending XML.
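For anyone who missed the SOA era, the contrast is mostly in the wire format, not the idea. A sketch (the service name and the `example.com/orders` namespace are made up; the envelope namespace is SOAP 1.1's real one):

```xml
<!-- c. 2005: fetching order 42 meant POSTing a SOAP envelope -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetOrder xmlns="http://example.com/orders">
      <OrderId>42</OrderId>
    </GetOrder>
  </soap:Body>
</soap:Envelope>
```

Today the same call would typically be `GET /orders/42` returning something like `{"orderId": 42, "status": "shipped"}`. The architectural idea, independently deployed services talking over the network, is unchanged.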


I've got a feeling that in the SOA days the services were just "services", not microservices. I.e., people were not pushing for 200-line microservices the way they do now. For example, I have a team of three devs at work who wrote a backend of around 10 microservices... Total cargo cult.


My first professional programming job was in 1996. I was writing C for DEC VAX and Stratus VOS mainframes and reading about undefined and implementation defined behavior on Usenet in comp.lang.c. My second job three years later was writing C and C++ for Windows computers along with MFC and some standard cross platform C++ using the STL.

Recently, there was a blog post submitted on HN with “trick” questions in C where they ask you to describe what the output would be. I immediately recognized that all of the code would result in undefined or implementation-defined behavior. I still keep abreast of the latest C and C++ standards out of morbid curiosity even though I have used neither in over a decade; the changes have been glacial compared to the front end.

> within the past decade I've seen the rise of cloud services

I know AWS pretty well, from the development, DevOps, and even operations/networking side. I hate to be “that guy,” but for the most part the cloud is just someone else’s computer where they manage databases, messaging services, caching services, monitoring, etc.

The largest changes in mindset were infrastructure as code, “cattle vs pets” and immutable architecture.

> non-relational databases, the Hadoop ecosystem, distributed storage systems, containers,

Especially in the cloud, a distributed storage system is just an API call away. NoSQL databases do require a change in mindset, but the world is still dominated by RDBMSs. In fact it is still dominated by the same three or four as in 2010: SQL Server, Oracle, MySQL, and Postgres. I’ve been going back and forth between MySQL and SQL Server for almost two decades and six jobs.

> the increased use of languages outside of C/C++/Java for systems programming (e.g., Python, Go, Rust, Clojure, Scala)

Besides Python, you can still safely ignore all of those languages, stick with C/C++, C#, and Java, and be good. Those were the same languages that were in demand when I (belatedly) jumped back into the job market in 2008. The others are all niche languages in the grand scheme of things. Go is starting to pop up more, admittedly.

> machine learning, and CUDA, among many other technologies that were either nonexistent or were in their infancy in 2010.

You can also safely ignore those and still be marketable.

As for low-level developers, the submitter seems to be a standard everyday “enterprise developer” (no insult intended; I am too). So low-level code, CUDA, and machine learning wouldn’t be relevant.


I can’t believe how popular the cloud has become, with every company pushing toward it. I totally agree with you that the cloud is basically someone else’s computer, but people make it seem like it’s something big, you know what I mean?


Half the reason that the cloud is a big deal to developers is that you can avoid administering servers.

The other half is that you can avoid server administrators.


> The further down the stack you go, the slower technology moves and the easier it is to stay relevant.

I don't think this can quite be generalized. Front-end web certainly moves the fastest, but does Windows application development move faster than lightweight microservices? I doubt it.


Lightweight microservices are still overwhelmingly Java, C#, and JavaScript. The technology hasn’t changed.


Why does the front end change faster?


You don't manage your career by programming.

You manage your career by being able to deliver projects, or at least participate in a meaningful manner, and then by building meaningful work relationships.


Many people I know like this found a stable job in government or a large corporation and for the most part stayed put there. They may have had a few other jobs, but like you mention, it's all enterprise stuff, which moves slowly enough.

They also tend to have different motivations, whether that be a family or unrelated hobbies.

Tbh sometimes I envy people like this and sometimes I don't.


I’m kind of like this... been at current employer around 8 years.

Motivation: very strictly 9-5, which is great for my toddler kids, my personal side interests, and being able to keep current even if my employer's tech is boring. Healthy compensation means I will be mortgage-free in 2 years.

It's a profitable large company that will easily weather the coming storms (they have always been tight with budgets), and I'm not worried about layoffs since I'm in a growing, future-investment part of the company.

If it all goes to shit once I own my house outright, I don’t care, since my wife owns a recession-proof, revenue-producing business on which we can survive if I need to retool (we can’t survive on solely that income while we still have a mortgage payment, though).

Many reasons for something like this.


That’s fine until the large corporation lays them off, they find their skill set is not marketable, and then they start whining about “ageism.” (I’m in my mid-40s and I’m still a developer, FWIW.)


I don't want to spend all my time honing one skill (programming). So I chose a job that is constantly challenging, to keep me intellectually stimulated, and moved from frontend to backend/infra (it took 18 months). Learning how to get up to speed with whatever technology is thrown at you is the skill one should develop, rather than focusing on a specific framework or programming language per se, because those come and go.

Also, programming is not the only skill that will help you in your career as an engineer.


> "I was thinking that I needed to in order to stay on top with marketable skills."

These people are fine not being on top and are consciously or unconsciously accepting the risk of falling off the rails. The risks have been fairly low for the past decade or so since software's role has just kept growing and growing in society and, at least for the foreseeable future, that growth doesn't seem likely to slow significantly.

Not a strategy I'd pursue but it doesn't seem like an unreasonable bet.


You can take risks on the job and learn as you go. If you have a good manager they will take your career goals into mind when deciding what you are doing on a per-sprint basis.

A great manager will allocate sprint points to you exploring a new area of technology as long as you can somehow realize value for the business as a result of that exploration.


A fantastic manager doesn't make you assign points to your work to fit inside an arbitrary time box.


I work on side projects if it's something that interests me. Otherwise, I try not to make programming my life. I have other interests in politics, film, etc. that I would also prefer to pursue.


At least for me "programming" is never 100% of my day to day work.

There are always research tasks and proofs of concept that allow you to learn and try out new things.


Well, the tools I learned to use are not just evaporating. They are still in use at many places and will be for the near-term future. New things I tend to learn on the job. (Some examples: React hooks, webpack and build tooling, GraphQL.)

I guess we are now entering into a new world, so we will see. But beside that, there is so much demand and opportunity, I don’t see why I should fear for my career. What are “the rails” and what does falling off them look like?

The other thing is, new stuff tends to just be repackaging of old stuff. Once you’ve seen a couple cycles of this you stop worrying about it.



