> Just to be clear, are you saying the only life work that you can find fulfillment in is work that can be perfectly automated and handled by AI?
No. I'm not saying that applies to me, but it may be getting dangerously close to many people. During my career, I've done CS, EE, controls, optics, and now I teach high school.
I do worry about CS in particular, though. If one's happy place is doing computer science, that's getting pretty hard.
LLMs feel to me like a 60th percentile new college grad now (but with some advantages: very cheap, very fast, no social cost to ask them to try again or to do possibly empty/speculative work). Sure, you can't turn them loose on a really big codebase or leave them unsupervised, but you can't do that with new graduates, either.
I worry about how 2026's graduates are going to climb the first rungs of the skill ladder. And to the extent that the tools get better, this problem gets worse.
I also worry about a lot of work that is "easy to automate" but where the human in the loop is very valuable. Some faculty use LLMs to write recommendation letters. Many admissions committees now use LLMs to evaluate recommendation letters. The result is an exchange that merely looks like human language, replacing a process where a human would at least spend a few minutes thinking about another human's story. The true purpose of the mechanism has been lost, and it's been replaced with something harsh, unfeeling, and arbitrary.