Hello HN! I'm a Staff Software Engineer with a variety of backend experience. I helped Discord scale its WebRTC system and helped guide its media infra (blog post: https://discord.com/blog/how-discord-resizes-150-million-ima... ). Currently open to work where I can put my skills to use and help get your projects into production.
Some of my personal and working traits: I'm a big fan of data-driven decision making. I think the best time to think about "how are we going to deploy this" is before any code gets written. I think empathy is underrated and I prefer over-communication to under-.
Powering the SSD on isn't enough. You need to read every bit occasionally so the controller notices weakening cells and refreshes them. If you have them in a NAS, then a monthly full-volume read is probably sufficient.
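For what it's worth, on a ZFS NAS a scheduled zpool scrub already reads back every allocated block. For a bare drive, a rough sketch of a "read everything" pass might look like this (Python on Linux; /dev/sda is just a placeholder, you'd need root, and whether the firmware actually refreshes weak cells after such a read is up to the drive):

    import sys

    CHUNK = 4 * 1024 * 1024  # 4 MiB per read

    def read_entire_device(path):
        # Sequentially read the whole device so every cell gets touched.
        total = 0
        with open(path, "rb", buffering=0) as dev:
            while True:
                chunk = dev.read(CHUNK)
                if not chunk:
                    break
                total += len(chunk)
        return total

    if __name__ == "__main__":
        # /dev/sda is a placeholder; pass your own device (needs root).
        path = sys.argv[1] if len(sys.argv) > 1 else "/dev/sda"
        print("read", read_entire_device(path), "bytes from", path)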
It would surely depend on the SSD and the firmware it's running. I don't think you can entirely count on it. Even if it were working perfectly, and your strategy was to power the SSD on periodically to refresh the cells, how would you know when it had finished?
NVMe has read recovery levels (RRLs) and two different self-test modes (short and long) but what both of those modes do is entirely up to the manufacturer. So I'd think the only way to actually do this is to have host software do it, no? Or would even that not be enough? I mean, in theory the firmware could return anything to the host but... That feels too much like a conspiracy to me?
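To make that concrete, the host-driven version might look roughly like this, shelling out to nvme-cli from Python (command names and flags as in the nvme-cli builds I've used, so check man nvme on yours; and again, what the self-test actually exercises is vendor-defined, so a plain full read from the host is the only part you fully control):

    import subprocess

    DEVICE = "/dev/nvme0"  # placeholder; needs root

    # Kick off the extended device self-test (self-test code 2h; 1h is the short one).
    subprocess.run(["nvme", "device-self-test", DEVICE, "--self-test-code=2"],
                   check=True)

    # The test runs on the drive in the background; poll the self-test log
    # for progress and results.
    result = subprocess.run(["nvme", "self-test-log", DEVICE],
                            check=True, capture_output=True, text=True)
    print(result.stdout)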
Huh. I wonder if this is why I'd sometimes get random corruption on my laptop's SSD. I'd reboot after a while and fsck would find issues in random files I hadn't touched in a long time.
If you're getting random corruption like that, you should replace the SSD. SSDs (and also hard drives) already have built-in ECC, so if you're getting errors on top of that, it's not just random cosmic rays. It's your SSD being extra broken, and that doesn't bode too well for the health of the SSD as a whole.
I bought a replacement but never bothered swapping it. The weird thing is the random corruption stopped happening a few years ago (confirmed against old backups, so it's not like I'm just not noticing).
It's quite possible. Some SSDs are worse offenders for this than others. I have some Samsung 870 EVOs that lost data the way you described. Samsung knew about the issue and quietly swept it under the rug with a firmware update, but once the data was lost, it was gone for good.
I ran into this firmware bug with the two drives in my computer. They randomly failed after a while -- and by "a while" I mean less than a year of usage. Took two replacements before I finally realized that I should check for an fw update.
It found problems in the tree - lost files, wrong node counts, other stuff - which led me to find files that didn't match previous backups (and which, when opened, were obviously corrupted, like the bottom half of an image being just noise). Since I found out this was a problem, I've also caught files that couldn't be read at all (IOError) and that fsck would delete on the next run.
I may not have noticed had fsck not alerted me something was wrong.
But metadata is data too, right? I guess the next question is: could parts of the FS metadata remain untouched long enough for the SSD data corruption process to occur?
Unfortunately, yes. Dropping a magnet onto a car and pulling it off, especially if the car hasn't been washed recently, will damage the paint to some degree. Maybe not enough for an average person to notice, but you really shouldn't do this to other people's cars.
Some people will get snide about anyone who cares about their car's paint, but as someone who once bought a car I had to save a long time for, and who spent a lot of time with car care products, I would be very sad if I saw you drop a magnet onto it and then pull it off without a second thought. Please don't.
The "unpainted" parts are galvanized. Galvanization is essentially zinc paint (with no dye). For the painted parts, the paint serves the same purpose, which is why it's important not to mess it up.
I'm curious, how did you make it this far in life without realizing that paint is delicate and scratches easily? Do you have untreated brick walls in your house or something?
It won't really matter all that much, but it will have done more than 0 damage to the paintwork (since metal is hard and paint is soft). Worth noting that drivers are touchy and emotional, and can't be trusted not to murder you over perceived slights, so it's safest to stick to doing nothing. Stuff something under the windscreen wipers if you really must, and even that is risky.
Unless the cars are perfectly washed and clayed, even running a clean finger over a car is likely to introduce scratches. I just wouldn’t ever touch someone’s car.
You can look up people trying to detail their cars to make them cleaner and ending up leaving "love marks." It doesn't matter how soft the thing you're using is: the car has contaminants on it, and by rubbing anything over the car, those contaminants end up scratching everything. It's like when you're at the beach and you're trying to brush sand off your skin. You're probably not rubbing aggressively or using much pressure, but it still hurts. It's the same with cars, it's just that the grit isn't as visible to you. It will leave swirls and scratches, though... which become noticeable.
I've had people just lean against my car when it wasn't completely clean and completely ruin the paint, requiring an entire 5-stage detail.
Yeah, we are talking pathological territory here. Car paint needs less love than its owner needs therapy if they have to "detail" the car every time a cat jumps on the hood to enjoy the warmth.
Cars spend a significant amount of time outside, and they depreciate so quickly that it just doesn't matter. One shouldn't expect paint to stay perfect, the same way we expect our skin to wear and age over the years.
I don't even know what a 5-stage detail means, but I can safely say you are overreacting. A car is just a tool, and a rando putting a fridge magnet on your car or leaning against it once in a while is completely negligible compared to the amount of shit the paint is exposed to when you drive it. Sand and dirt don't ask for your permission either.
As long as the car is dirty, then contact with it can damage the top coat. This is a lot more true if you need to drag or scrape the magnet to remove it.
HN readers are, on average, high on technical know-how and bad at social skills and reading the room. What you're seeing is the natural outcome of that.
The very light moderation (that even shows dead comments from banned accounts) and clean, minimal frontend with essentially no restrictions on creating throwaways also makes it pretty attractive for "my first AI app" experiments. Ever since GPT 3 was released I see a graveyard with a scattering of dead, green, obvious LLM replies on most articles, sometimes with account names like "accounttest14" that don't even try to hide it.
Windows 10 EOL is probably helping to churn a lot of aging Intel chips out here. I can't imagine anyone in the know is building a new desktop with an Intel anything in it these days, either.
Unless you need to use AutoCAD: their software has garbage-level optimization on AMD CPUs. It's probably the only software where you'll see an Intel i7-series CPU beat an AMD R9 by a big margin.
Probably they're using the Intel MKL library for their linear algebra (which is severely gimped on AMD: the dispatcher refuses to take the fast AVX code paths on non-Intel CPUs and drops to a slow generic fallback).
If they'd written the SIMD code themselves, the gap between the two shouldn't be big. (AMD is actually better for SIMD nowadays, since recent models support the AVX-512 instruction set while Intel dropped it on consumer parts due to the P/E-core split fiasco.)
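A minimal illustration of the difference (this is not MKL's actual dispatcher, just a Linux-only sketch of vendor-string gating versus feature-flag gating, reading /proc/cpuinfo):

    def cpu_info():
        # Pull the vendor string and the feature flag set from /proc/cpuinfo.
        vendor, flags = "", set()
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("vendor_id"):
                    vendor = line.split(":", 1)[1].strip()
                elif line.startswith("flags"):
                    flags = set(line.split(":", 1)[1].split())
                    break
        return vendor, flags

    vendor, flags = cpu_info()

    # Feature-flag dispatch: use the fastest path the hardware actually supports.
    if "avx512f" in flags:
        path = "AVX-512 kernels"
    elif "avx2" in flags:
        path = "AVX2 kernels"
    else:
        path = "generic fallback"

    # Vendor-string dispatch (the behavior being complained about): a Zen 4/5
    # chip reports vendor_id "AuthenticAMD" and gets the slow path even though
    # it advertises avx512f.
    vendor_gated = path if vendor == "GenuineIntel" else "generic fallback"

    print("vendor:          ", vendor)
    print("by feature flags:", path)
    print("by vendor gate:  ", vendor_gated)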
Windows 10 will be the last MSFT OS I ever use. I rebuilt using an AMD CPU/GPU, booted up Fedora 42, and I have never had to run a single shell command to get anything to work. I don't even notice my OS. Work, games, local models (this one still takes some tweaking but is better), all work fine.
As a side note, Intel's discrete GPUs are also famous for high-quality video transcoding - they were quite popular with streamers who needed a second helper PC just for OBS streaming.
It has been, for a long time, the latest-generation Intel CPU with a 2xxK or 2xxKF model number. These used to be "i7" models; now there's just a "7". I'm very vaguely annoyed at the branding change.
It would be hard for anybody to convince me that there is a better price/performance optimum. I get it, there was a very disappointing generation or two a few years ago, but that hasn't put me off.
The dominance of Apple CPUs might be putting me off both Intel and AMD and making me consider only buying Apple hardware, maybe even doing something like Linux running on a Mac Mini in addition to my macOS daily driver.
FYI www.cpubenchmark.com is a running joke for how bad it is. It’s not a good resource.
There are a few variations of these sites like userbenchmark that have been primarily built for SEO spam and capturing Google visitors who don’t know where to go for good buying advice.
Buying a CPU isn’t really that complicated. For gaming it’s easy to find gaming benchmarks or buyers guides. For productivity you can check Phoronix or even the GeekBench details in the compiler section if that’s what you’re doing.
Most people can skip that and just read any buyers guide. There aren’t that many CPU models to choose from on the Pareto front of price and performance.
> For productivity you can check Phoronix or even the GeekBench details in the compiler section
I guess the reason people prefer something like cpubenchmark is that it seems way easier to get an overview / see data in aggregate. GeekBench (https://browser.geekbench.com/v6/cpu/multicore), for example, just lists every submitted result, even when the CPU is the same. Not exactly conducive to finding the right CPU.
Userbench is openly mega-biased and fudges its own test scores against AMD; it's so bad it shouldn't even be listed in search results.
There are also many criticisms of CPUBenchmark that are much more minor, like its oversimplified testing leading to weird anomalous score gaps between extremely similar CPUs.
For the average consumer, I think CPUBenchmark is fine and probably as good as you can ask for without getting into the weeds, which defeats the purpose really.
>FYI www.cpubenchmark.com is a running joke for how bad it is. It’s not a good resource.
That's not the prevailing opinion at all. Passmark is just fine and does a lot to keep its data solid, like taking extra steps to filter out overclocked CPUs. Then you go on to recommend GeekBench??? Right...
- Generic benchmarks don't pick up unique CPU features, nor do they pick up real-world application performance. For example, Intel has no answer to the X3D V-Cache architecture that makes AMD chips better for gaming.
- You can’t really ignore motherboard cost and the frequency of platform socket changes. AMD has cheaper boards that last longer (as in, they update their sockets less often so you can upgrade chips more and keep your same board)
- $400 is an arbitrary price ceiling and you’re not looking at dollars per performance unit, you’re just cutting off with a maximum price.
- In other words, Intel chips are below $400 because they aren’t fast enough to be worth paying $400+ for.
- If you’re looking for integrated graphics, you’re pretty much always better off with AMD over Intel
I got a 265KF and motherboard for $350. Plenty fast, and it saves money for the real issue, which is GPU costs. Thankfully the B580 is actually a pretty good deal as well at $250 compared to green or red options. Team blue really has some good deals out there if you aren't tied to a team color.
When I read "here's how I choose...", at no point did I engage with it as anything other than "this is what some random dude does once every 5 years." Let him pick his CPU how he does it. You're overreacting, and frankly over-emphasizing things that don't matter, like needing V-Cache or AVX-512, or misapprehending his own price points.
> $400 is an arbitrary price ceiling and you're not looking at dollars per performance unit, you're just cutting off with a maximum price.

So if there's a $430 AMD CPU that's 20% faster, you're going to forgo that better price-per-performance value just because it's slightly above your price target.
My choice of CPU currently has the best value / performance on this benchmark aside from two very old AMD processors which are very slow and just happen to be extremely cheap. No new AMD processors are even remotely close.
It's also currently $285; no top-tier performers are even close except SKUs which are slight variations of the same CPU.
> Generic benchmarks don't pick up unique CPU features, nor do they pick up real-world application performance. For example, Intel has no answer to the X3D V-Cache architecture that makes AMD chips better for gaming.
Happy to be convinced that there's a better benchmark out there, but if you're trying to tell me something is better in a way that can't be measured, I don't believe you, because that's just "bro science".
> If you’re looking for integrated graphics, you’re pretty much always better off with AMD over Intel
I never have been looking for integrated graphics, sometimes I have bought the CPU with it just because it was a little cheaper.
> You can’t really ignore motherboard cost and the frequency of platform socket changes. AMD has cheaper boards that last longer (as in, they update their sockets less often so you can upgrade chips more and keep your same board)
I've always bought a new motherboard with a CPU and either repurposed, sold, or given away the old CPU/motherboard combination, which seems like a much better use of money. The last one went to somebody's little brother. The one before that is my NAS. There's not a meaningful difference between comparable motherboards to me, particularly when the competing AMD CPUs are nearly double the cost or more.
It’s hard to take you seriously if you’re going to claim equivalent AMD processors cost double or more.
Your example of tossing your motherboard away is not a very good one here. That was your choice to act illogically. My AMD AM4 motherboard started with a Ryzen 1600, then a 3600, and now runs a 5600X3D.
Basically I’ve had this same motherboard for something like 6 or 7 years and the performance difference between a Ryzen 1600 and 5600X3D is completely wild. I’ve had no need to buy a new board for the better part of a decade. If you’re buying a new board with every processor purchase that’s a huge cost difference.
When I say that generic benchmarks are bad, I mean that CPU benchmarks like the one you are linking now are bad. You need more practical benchmarks like in-game FPS, how long a turn takes in Stellaris, how long it takes to encode a video or open a ZIP file, etc.
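As a rough sketch of the "open a ZIP file" kind of test (Python stdlib only; the file count and sizes here are arbitrary, it's the shape of the measurement that matters):

    import io, os, time, zipfile

    def make_archive(n_files=200, size=256 * 1024):
        # Build an in-memory ZIP full of random (incompressible) files.
        buf = io.BytesIO()
        with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
            for i in range(n_files):
                zf.writestr(f"file_{i}.bin", os.urandom(size))
        return buf.getvalue()

    data = make_archive()

    start = time.perf_counter()
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        for name in zf.namelist():
            zf.read(name)  # decompress everything, discard the bytes
    elapsed = time.perf_counter() - start

    print(f"extracted {len(data) / 1e6:.1f} MB archive in {elapsed:.3f} s")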
Those practical, gaming-oriented numbers are where the X3D chips come in as well. You might be able to buy an Intel chip with more cores and better productivity performance, but if you're eyeing gaming performance like I imagine most desktop DIY builders are, you'd rather get better gaming-oriented performance and sacrifice some productivity performance.
If you are gaming and buy a 9800X3D, Intel literally does not make anything faster at any price. You can offer Intel $5,000 and they won't have anything to sell you that plays games faster.
At lower price points, AMD still ends up making a lot of sense for their long-supported sockets, low cost boards, better power/heat efficiency, and X3D chips performing well in gaming applications.
Just buy an old one. Unless you are doing some sort of cutting edge work, an old one works fine. It's crazy how cheap they are. I assume because Apple users always like to churn to the newest thing.
I see the current base Mac Mini going for $499 new, but that's 16GB of unified RAM and a 256GB SSD. I'm currently using 17.5GB of memory on Win11, but most of that is Brave with a ton of extensions loaded and many tabs open. I'd be using the Mac for typical office stuff with some occasional programming, probably with JetBrains IDEs. I'd like to do some AI stuff too; my current laptops are way too slow.
It does feel like, when you click the "pay $400 more for a $30 hardware upgrade" button, Tim Apple himself is laughing at me, knowing their siren song has already worked and I am at their mercy, wallet open...
Running 32GB RAM and a 1TB SATA SSD with Windows 10, like a mad scientist, on my thus-upgraded 15-year-old Sony VAIO laptop.
SATA and SDRAM are backwards compatible, so a couple of years ago I put a new 1TB SATA SSD in the old SATA1 slot and two cheap DDR4 3200+ MHz SDRAM sticks in the RAM slots (I can upgrade again in a few years). This Sony VAIO notebook (for it is a cute little laptop) now purrs like a Jaguar waiting to be unleashed. Dual-booting Windows 10 and Mint Linux, the OS boots in a few seconds and everything feels so snappy to work there.
Meanwhile, my Apple Mac Mini 2012 (Intel CPU) - which needed extraordinary efforts by me to make it triple-boot macOS, Windows 10, and Linux (trust Apple to make it hard to install other OSes on an Intel-CPU PC) - is slow and fussy because of its meagre RAM and old HDD (not SSD). But the Apple service center refused to upgrade this Mac Mini to new RAM and a new SSD, citing Apple policies that don't allow such upgrades. Apple has made it quite hard to custom-upgrade such iDevices, so this little PC is lying unused in my cupboard, waiting for the rainy day when I'll get the courage to tinker with it myself and upgrade it. And even if I did upgrade the hardware, this Mac Mini can only be upgraded to macOS Catalina, and it won't get security upgrades, because Apple has stopped supporting it.
> And even if I did upgrade the hardware, this Mac Mini can only be upgraded to macOS Catalina, and it won't get security upgrades, because Apple has stopped supporting it.
Your comment mostly makes sense, but this is a weird thing to mention when Windows is even worse on this front now, with Win11 not supporting much more recent machines.
I don't even want to fall down the rabbit hole of installing macOS on a normal laptop again, and my old 2014 ThinkPad with 8GB of RAM plus a 256GB SSD isn't going to light the world on fire performance-wise.
I wish PCs had a unified GPU with 400-1000 GB/s of bandwidth to main memory, and up to 256 GB of it (or even 512 GB, as in the Mac Studio). It's nice for AI. Thus I'm staying on Macs, at least for now.
You’re not missing out on a lot. Coming from someone who has used their products for many years now. Their products have more compromises and trade-offs now than they did during Apple’s Intel era.
What you will tangibly miss is low-noise, low-power-draw hardware, and very, very specific workloads being faster than on the cutting-edge AMD/Nvidia stack people are using today.
I have a work-issued M3 MacBook Pro, and at home my daily drivers are a Ryzen 9 3900 PC (still on Windows 10) and a Framework 13 laptop with a Ryzen 5 7640U running Windows 11. The hardware on my MacBook Pro is fantastic; I get amazing battery life that lasts far longer than my Framework 13, and the performance is excellent. I also love my MacBook Pro's build quality.
However, the reason my personal laptop is a Framework 13 and not a MacBook Pro is that I value upgradability and user-serviceability. My Framework has 32GB of RAM, and I could upgrade it to 64GB at a later date. Its SSD, currently 1TB, is also upgradable. I miss the days of my 2006 Core Duo MacBook, which had user-serviceable RAM and storage. My Ryzen 9 3900 replaced a 2013 Mac Pro.
Additionally, macOS doesn't spark the same type of joy that it used to; I used to use Macs as my personal daily drivers from 2006 to 2022. While macOS is less annoying than Windows to me, and while I love some of the bundled apps like Preview.app and Dictionary.app, the annoyances have grown over the years, such as needing to click a security prompt each time I run lldb on a freshly-compiled program. I also do not like the UI directions that macOS has been taking during the Tim Cook era; I didn't like the changes made during Yosemite (though I was able to live with them) and I don't plan to upgrade from Sequoia to Tahoe until I have to for security reasons.
Apple's ARM hardware is appealing enough to me that I'd love to purchase a M4 Mac Mini to have a powerful, inexpensive, low-power ARM device to play with. It would be a great Linux or FreeBSD system, except due to the hardware being undocumented, the only OS that can run on the M4 Mac Mini for now is macOS. It's a shame; Apple could probably sell more Macs if they at least documented enough to make it easier for developers of alternative operating systems to write drivers for them.
For the 7800x3d and 9800x3d being really good CPUs? I worked on Cinebench and its rendering engine, so I'll often go by its single- and multi-threaded results, and in a past life I worked on Indigo Renderer and find IndigoBench still works great too: https://indigorenderer.com/indigobench