Hacker News | throwaway1280's comments

This is an incredible achievement, but as a musician, I wish this would go die in a fire.

I'm a bedroom hobby musician with no dreams of ever making it big, but even so, I'm looking at the hours I'm spending trying to improve my skills and thinking what's the point, really, when I could just type in 'heavy metal guitar solo at 160bpm, A minor' and get something much much better?

I know there is value in creating art for art's sake. I've always been up against a sea of internet musicians, even when I started back in 2000. But there's just something about this that's much more depressing, when it's not even other people competing with me, but a machine which hasn't had to invest years of its life in practice to beat me.


As a fellow (hobbyist) musician, I feel you, but after doing a lot of introspection I realized that it's the art (and the process) that I really like, not (just) the end result (though that is of course a rewarding aspect). For example, jamming out to a kickin' song is fun, even if I'm just covering something. I also realized that my own ability to produce things isn't affected by this (as long as you don't want to make money on it). As someone who loves to play bass but is generally bad at writing bass riffs, I also see some fun potential in using AI to get bass tracks that go along with my main guitar riffs. I can always throw them out and rewrite from scratch later, or just iterate on them until I get them where I like them. I do think I'd feel a bit of a loss of "artistic purity" doing something like that, but the more I think about it, the biggest reason it might bother me is that I'd feel judged by other musicians :-D


When Harmonix was in the concept phase for Guitar Hero, there were two slides in the presentation. The first was the "con" slide: novel game style risks not finding a market, technical challenges of the new style of gameplay, requiring peripherals for a game is a famous pathway to low-volume sales.

The next slide was labeled "pro," and it was just a picture of Jimi Hendrix on-stage mid-performance.

I'd submit to you the notion that even if the machine can create a billion billion iterations of music, it still cannot create what you will create, for the reasons you will create it, and that's reason enough to continue. Hendrix wasn't just "a guy who played guitar good." And a machine that could word-for-word and bar-for-bar synthesize "Foxy Lady" wouldn't be Hendrix.

Hendrix, also, can't be you. Nor you him.


It is embarrassing how much time I spent playing that dumb game instead of actually practicing a more versatile instrument.


I am not sure if this will be any help to you at all, but the technical skill of music - especially in (from your comment) heavy metal - has only ever been a tiny part of it. You probably know a guy in a local pub band who can note-for-note play every single thing that Tony Iommi ever recorded, for instance.

There are hundreds of bands who play three-chord doom or mindless-shredding grind who just learned one thing and do it well, and who play to hundreds of people multiple times a week (often including me). We go to see these bands not to see what they can play, but to see what they are saying with what they play.

This is why I feel that I can never describe LLM-generated content as 'art'. Art is about the story. People will go and see a punk band who only know three chords if they play songs about things that resonate with them. Bit of a tangent, but this is the same reason that I genuinely believe that if you could bio-engineer a steak that tastes exactly like one from a well-looked-after cow from a notable breed and a good farm, most people would still prefer the cow. The story matters - the fact that a person put effort and experience into something really is important.

"This solo is sick" is a fun plus-point, but it doesn't matter if the song doesn't mean anything to you. If proficiency were the only thing that mattered, we'd all be listening to the worst kind of prog.


I can't see the future, but I imagine that the human art community may actually get more vibrant when divorced from being a way to make a living. Perhaps a return to something like a patronage system for the exceptional artists.

Open mics, music circles and concerts also remain untouched for the moment.


Like you, I am just a hobbyist making beats in my room. No expectations of ever being a real musician. But when I'm jamming and I create a beat and synth line, start adding other instruments and really get a song going, there is a feeling that I get that a generated song will never ever ever be able to recreate for me. It's like a rush, almost a euphoric tingling (and no, I'm not on drugs) that feels like a runner's high. No output from a prompt-driven AI algorithm would ever do that to me. That's the value I see in making art for art's sake, in practicing a craft and trying to get better at something purely for the sake of getting better at it.


I would suggest that instead of feeling demoralized by AI, you instead ask what you can offer musically that an AI cannot. I also would suggest trying to let go of the notion that music is primarily about achieving a certain level of technical proficiency. There are no limits on your growth musically unless you artificially constrain them because a technology has deluded you into believing that you don't already have what you need inside of you.

Do you regularly play with other people? That is a good way to disabuse yourself of the notion that all that matters is technique.


Eh. I'm in exactly the same group she's talking about (born in 84, so at the upper end of millennial age) and my inheritance has started already. My parents have calculated that they have enough to live on until the time they expect to die, so they're passing any extra onto me early to avoid inheritance tax.

Personally, it kinda grates on me. I'm very much a soft socialist, who believes in everybody paying their fair share of tax. I've also derived a lot of satisfaction over the years in becoming self-sufficient[1] and being able to pay my own way without them having to prop me up. But it's their money and they're free to do with it as they wish.

As it stands, I'm doing the privileged thing that the author mentions, and 'refusing' it: it's going in an investment account and I'll continue to live within the means that I earn myself. But I know that that still means that I'll never be afraid of having to live on the dole, or do a job that I utterly hate.

In any case, even without that money, as an ex-FAANG developer I'm still significantly more privileged than most of my peers: I don't have any debt, I own my (fairly cheap) home outright, and I own a (very cheap) car and motorbike, also outright. Amongst my friends, most of whom are renters working in precarious jobs, I just don't mention money.

[1] How does that work with the socialism? Well - I'm happy to take from the government, because everybody can get the same, and I've repaid that with taxes over the years.


> Personally, it kinda grates on me.

Presumably there are (relatively) straightforward solutions to this issue, no? For instance: accept the money from your parents, invest it into an index fund without touching it, and when your parents do pass on, donate the inheritance to GiveDirectly or whatever charity you deem worthwhile.


If it grates you so much, give the money to someone or something else who would appreciate it more.


That sounds incredibly rough.

But as a simple counterpoint to this, I'm getting a lot out of therapy, as are most of my friends who are in therapy. It's not doing what I thought it would - it hasn't fixed my depression overnight, or cured my irritability - but it's providing a different framework to think about the world and relate to people, and I like who I am an awful lot more than I like the person I was.

It's expensive for me, at 60 bucks a session on a not-particularly-high wage, and I regularly revisit the idea of stopping, but I think I'm still getting something useful and insightful out of most appointments, so I'm keeping on going.


Just letting you know that I downvoted you not because of the advice - which is pretty good - but because of the link to Jordan Peterson, who is generally a pretty nasty pseudo-intellectual piece of work.


Thank you for your opinion. I figured a lot of people would have mixed feelings about him but I wanted to give credit to where the advice was coming from. Personally, I've gotten a lot of value from his work.


With enough of a benzo habit that he had to detox in an induced coma, besides.

Can happen to anyone, of course, and it's not necessarily a reflection on character in its own right, but it is worth giving some thought to the value of advice on how to keep your life together when it comes from someone who's demonstrated such an inability to keep his own life together.

Even on the simplest things - his constant line is "clean your room", and every time you see him on a Zoom interview from home it's wall-to-wall clutter in the background. Granted, "clean your room" is pretty uncontroversial, and is good advice - as is that you should pet a cat when you find the chance. But, in this as in everything else, epistemology matters: you do want to consider the source.


Yup. There was a famous memo from Jeff that went round years ago that read something like "I want to buy a kettle. I search for 'kettle' and I get thousands of products. That sucks - now I have to comparison shop for kettles, when I don't particularly care what sort of kettle I get. Why can't Amazon just tell me 'buy this one'?"

Amazon's Choice arrived years after my tenure at the big A, and I'm not sure it's directly connected, but it definitely helps solve this problem.


That's a blindfold, not a solution.

If I told a personal assistant "I want any kettle", they would know there's an implied <among buy-it-for-life high-quality stainless steel in America with a non-garish finish>.

They would also know me well enough to provide options when it's a non-commodity context, i.e. pressure cookers, where the same implied values hold but I want to actually make the feature tradeoffs as a power user.

YouTube went down the global-optimization path, and the experience was terrible...


Yes, indeed.

I've had depression for nineteen years, and because of it I have giant holes in my memory - I know that I went places and did things when I was growing up, but I have next to no memories of any of them. My short-term memory malfunctions a lot - I can go through an entire day and have zero memories of it the next day. I suspect I'd lose a few more memories with ECT but, really, I would happily trade those memories for a life where I could actually function.


Yeah. I did this once a few years ago, and it was quite unpleasant. Did get it done in the end, but it definitely put my team off looking for any other useful NPM packages.

I wonder if it's any more streamlined now?


There are some internal build tools that I can vouch for. If you're still at Amazon, feel free to ping me at dbarsky@ and we can chat.


Ex-Amazon engineer of several years here.

This is a pretty interesting article, but it's important to know that Amazon's internal tooling changes pretty fast, even if it's mostly several years behind state-of-the-art.

Exhibit A: Apollo

Apollo used to be insane. It was designed for the use case of deploying changes to thousands of C++ CGI servers on thousands of website hosts, worrying about compiling for different architectures, supporting special fleets with overrides to certain shared libraries, etc etc. It had an entire glossary of strange terms which you needed to know in order to operate it. Deployments to our global fleet involved clicking through tens of pages, copy-and-pasting info from page to page, duplicating actions left right and centre, and hoping that you didn't forget something.

When I left, most of that had been swept away and replaced with a continuous deployment tool. Do a bit of setup, commit your code to the internal Git repo, watch it be picked up, automated tests run, then deployments created to each fleet. Monitoring tools automatically rolled back deploys if certain key metrics changed.
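The metric-gated rollback idea is worth spelling out. Here's a toy sketch of the core check, not Amazon's actual tooling: all function names, metric names, and the 10% threshold are invented for illustration.

```python
# Illustrative sketch: roll a deployment back if any key metric regresses
# past a tolerance relative to its pre-deploy baseline. All names and
# thresholds here are hypothetical, not Amazon's internal system.

def should_roll_back(baseline: dict, current: dict, tolerance: float = 0.10) -> bool:
    """Return True if any baseline metric has dropped by more than `tolerance`."""
    for metric, base_value in baseline.items():
        cur_value = current.get(metric, 0.0)
        # Treat all metrics as higher-is-better for simplicity.
        if base_value > 0 and (base_value - cur_value) / base_value > tolerance:
            return True
    return False

baseline = {"success_rate": 0.999, "orders_per_min": 1200.0}
healthy  = {"success_rate": 0.998, "orders_per_min": 1180.0}  # small wobble
degraded = {"success_rate": 0.999, "orders_per_min": 900.0}   # 25% drop

print(should_roll_back(baseline, healthy))   # False
print(should_roll_back(baseline, degraded))  # True
```

A real system would of course compare against alarm thresholds per metric and wait out a bake period before declaring a deployment healthy, but the shape of the decision is this simple.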

Auto scaling became a reality too, once the Move to AWS project completed. You still needed budgetary approval to up your maximum number of servers (because for our team you were talking thousands of servers per region!) but you could keep them in reserve and only deploy them as needed.

Manually copying Apollo config for environment setup was still kind of a thing though. The ideas of CloudFormation hadn't quite filtered down yet.

Exhibit B: logs

My memory's a bit hazy on this one. There certainly was a lot of centralized logging and monitoring infrastructure. Pretty sure that logs got pulled to a central, searchable repository after they'd existed on the hosts for a small amount of time. But, yes, for realtime viewing you'd definitely be looking at using a tool to open a bunch of terminals.

The monitoring tools got a huge revamp about halfway through my tenure, gaining interactive dashboarding and metrics drill-down features which were invaluable when on-call. I'm currently implementing a monitoring system, so my appreciation for just how well that system worked is pretty high!

Exhibit C: service discovery

Amusingly, a centralized service discovery tool was one of the tools that used to exist, and had fallen into disrepair by the time this person was working there.

This was a common pattern in Amazon. Contrary to the 'Amazon doesn't experiment' conclusion, Amazon had a tendency to experiment too well - the Next Big Thing was constantly being released in beta, adopted by a small number of early adopters, and then disappearing for lack of funding/maintenance/headcount.

I can't think of any time I hard-wired load balancer host names though. Usually they would be set up in DNS. We used to have some custom tooling to discover our webserver hosts and automatically add/remove them from load balancers, but that was made obsolete by the auto-scaling / continuous deployment system years before I left.

As for the question of "can we shut this down? who uses it?" - ha, yes, I seem to remember having that issue. I think that, before my time, it wasn't really a problem: to call a service you needed to consume its client library, so you could just look in the package manager to see which services declared that as a dependency. With the move to HTTP services that got lost. It was somewhat mitigated over the years by services moving to a fully authenticated model, with client services needing to register for access tokens to call their dependencies, but that was still a work in progress a few years ago.
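The client-library trick above amounts to inverting the dependency graph. A minimal sketch of that lookup, with invented package names:

```python
# Hypothetical sketch of "who calls this service?" via declared dependencies:
# if every caller must consume the service's client library, consumers can be
# found by inverting the package dependency graph. Package names are made up.

deps = {
    "OrderingWebsite": ["PaymentsServiceClient", "CatalogServiceClient"],
    "MobileGateway":   ["PaymentsServiceClient"],
    "ReportingBatch":  ["CatalogServiceClient"],
}

def consumers_of(client_lib: str) -> list:
    """Return every package that declares `client_lib` as a dependency."""
    return sorted(pkg for pkg, libs in deps.items() if client_lib in libs)

print(consumers_of("PaymentsServiceClient"))  # ['MobileGateway', 'OrderingWebsite']
```

This is exactly what breaks with plain HTTP calls: nothing forces the caller to declare anything, which is why the access-token registration model restores the visibility.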

Exhibit D: containers

Almost everything in Amazon ran on a one-host-per-service model, with the packages present on the host dictated by Apollo's dependency resolution mechanism, so containers weren't needed to isolate multiple programs' dependencies on the same host.

Screwups caused by different system binaries and libraries on different generations of host were a thing, though, and were particularly unpleasant to diagnose. Again, that mostly went away once AWS was a thing and we didn't need to hold onto our hard-won bare-metal servers.

'Amazon Does Not Experiment'

Amazon doesn't really do open source very well. The company is dominated by extremely twitchy lawyers. For instance, my original employment contract stated that I could not talk about any of the technology I used at my job - including which programming languages I used! Unsurprisingly, nobody paid attention to that. That meant that for many years, the company gladly consumed open source, but any question of contributing back was practically off the table as it might have risked exposing which open source projects were used internally.

A small group of very motivated engineers, backed up by a lot of open-source-friendly employees, gradually changed that over the years. My first ever Amazon open source contribution took over a year to be approved. The ones I made after that were more on the order of a week.

Other companies might regard open sourcing entire projects as good PR, but Amazon doesn't particularly seem to see it that way. Thus, it's not given much in the way of funding or headcount. AWS is the obvious exception, but that's because AWS's open source libraries allow people to spend more money on AWS.

Instead, engineers within Amazon are pushed to generate ideas and either patent them, or make them into AWS services. The latter is good PR and money.

As for different languages: it really depends on the team. I know a team who happily experimented with languages, including functional programming. But part of the reason for the pushback is that a) Amazon has an incredibly high engineer turnover, both due to expansion and also due to burnout, so you need to choose a language that new engineers can learn in a hurry, and b) you need to be prepared for your project to be taken over by another team, so it better be written in something simple. So you better have a very good justification if you want to choose something non-standard.

Overall, Amazon is a pretty weird place to work as an engineer.

I would definitely not recommend it to anybody whose primary motivation was to work on the newest, shiniest technologies and tooling!

On the other hand, the opportunities within Amazon to work at massive scale are pretty great.

One of the 'fun' consequences of Amazon's massive scale is the "we have special problems" issue. At Amazon's scale, things genuinely start breaking in weird ways. For instance, Amazon pushed so much traffic through its internal load balancers that it started running into LB software scaling issues, to the point where eventually they gave up and began developing their own load balancers! Similarly, source control systems and documentation repositories kept being introduced, becoming overloaded, then replaced with something more performant.

But the problem is that "we have special problems" starts to become the default assumption, and Not Invented Here starts to creep in. Teams either don't bother searching for external software that can do what they need, or dismiss suggestions with "yeah, that won't work at Amazon scale". And because Amazon is so huge, there isn't even a lot of weight given to figuring out how other Amazon teams have solved the same problem.

So you end up with each team reinventing their own particular wheel, hundreds of engineer-hours being logged building, debugging and maintaining that wheel, and burned-out engineers leaving after spending several years in a software parallel universe without any knowledge of the current industry state-of-the-art.

I'm one of them. I'm just teaching myself Docker at the moment. It's pretty great.


Speaking of twitchy lawyers and Move to AWS... one of the weirdest things we had to deal with inside Amazon was that, for many years after AWS launched, we weren't allowed to use it because it "wasn't secure enough".

Given that we were actively shopping it around to major financial institutions at the time, doesn't that strike you as particularly hypocritical? :)


So wait, when I need to convince customers why AWS is secure for their data, I can't say "It's good enough for Amazon!"?


To clarify, an AWS customer has a shared responsibility to describe the security of their systems including how they use AWS tools, and in this respect Amazon is no different than other AWS customers.


GP's comment may have been true at one point, but AWS is extremely mainstream within Amazon now and has been for several years.


No, you can say hey amazon might use this if they do a security evaluation first.

It's totally cool for your data though, don't worry about it.

Amazon fucking sucks at dogfooding.


This is misleading. This had more to do with internal audit tools and guardrails available than it did the services themselves.


It's still highly restricted for many teams working with customer data


Your comment is better than the original article. Can we push this one to the top, HN?


Worked for Amazon. It was a big issue internally, but we never got a good answer to that question.


Yeah. Amazon loses a small but significant number of potential hires when they learn about the open source clause - often very good people, too. When I left they were actively working on trying to find a solution, as it had been a thorn in the side for years.

