> I wonder how many out there seriously think we could ever completely rid ourselves of the CPU. It seems to be a rising sentiment.
This sentiment is not a recent thing. Ever since GPGPU became a thing, there have been people who hear about it for the first time, don't understand processor architectures, and get excited about GPUs magically making everything faster.
I vividly recall a discussion with some management type back in 2011, who was gushing about getting PHP to run on the new Nvidia Teslas, how amazingly fast websites will be!
Similar discussions also spring up around FPGAs again and again.
The more recent change in sentiment is a different one: the "graphics" origin of GPUs seems to have been lost to history. I have met people (plural) in recent years who thought (surprisingly long into the conversation) that I meant Stable Diffusion when I talked about rendering pictures on a GPU.
Nowadays, the 'G' in GPU probably stands for GPGPU.
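A toy Python/NumPy sketch of the distinction the comment above is gesturing at (the workload and constants here are entirely made up for illustration): GPUs shine on data-parallel work where every element is independent, but a serial dependency chain gains nothing from thousands of cores, which is why "run it on the GPU" doesn't magically speed up arbitrary code like a PHP web server.

```python
import numpy as np

# Data-parallel work (what GPUs excel at): each element is computed
# independently of every other, so lanes can run side by side.
a = np.arange(1_000_000, dtype=np.float64)
parallel_result = np.sqrt(a) + 1.0

# Serial dependency chain (what GPUs cannot speed up): each step
# needs the previous result, so there is nothing to parallelize.
x = 0.5
for _ in range(1000):
    x = 3.7 * x * (1.0 - x)  # logistic map: inherently sequential
```

The first computation maps cleanly onto SIMD/GPU hardware; the second is bound by the latency of a single chain of dependent operations no matter how many cores are available.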
The dream, I think, has always been heterogeneous computing. The closest we've come is probably Apple, with their multi-core CPUs with different core types and a GPU with unified memory (someone with more knowledge of computer architecture could probably correct me here).
Have a CPU, GPU, FPGA, and other special-purpose chips like neural accelerators, all with unified memory, somehow pipelining specific workloads to whichever chip handles them best.
I wasn't really aware people thought we would be running websites on GPUs.
> How feasible is it to make a crt from parts? [...] I've never seen a DIY CRT kit before.
The closest thing that springs to mind: A friend of mine once drilled a hole into an empty Vodka bottle, stuck two wires in it (one at each end), a hose adapter for a vacuum pump, "sealed" the whole thing with a hot glue gun and hooked it up to several scavenged microwave oven transformers in series. Yes, the output was rectified and capacitors were also involved.
I suppose rearranging the electrodes (using a piece of sheet metal with a hole in it; both fed through the neck of the bottle) and wrapping the sides of the bottle with 4 strips of aluminium foil could get you a beam and some crude deflection control. Not sure, though, what you would coat the end of the bottle with, but I guess vacuum coating would be applicable.
If that sounds absolutely insane to you, I'd wholeheartedly agree.
At least to my ears, trying to build a CRT from first principles, combined with learning-by-doing and learning-EE-from-youtube-tutorials, sounds like a fast path to end up either dead or in a permanent care facility. Not exactly something I'd hand out in beginner-friendly kit form.
from the turn of the previous player, your intended move will typically be more complicated to visualize (at least for children) - this is what this game is about - so children tended to call an "Eckkarte" an "Ätschkarte".
Maybe that game exists, but it's an old word, and you can find references to it that are hundreds of years old. Its meaning fits the browser. Your claim that it's just some kind of reference to a special game is not correct.
I'm pretty sure if you really want to, you could do something like this as a hobbyist with a Pentium right now.
Instead of futzing with wires on a breadboard, you could simply design a PCB up front, throw the design over the fence at JLC or PCBWay, insert coin, wait patiently at the mailbox, and solder your scavenged Socket 7 onto the board.
The days of toner transfer and aquarium pumps are already long gone. Getting production quality, one-off, multi layer PCBs done as a hobbyist is dirt cheap these days, no government budgets required.
> you could simply design a PCB up front, throw the design over the fence at JLC or PCBWay, insert coin, wait patiently at the mailbox
It blows my mind that I can use free-as-in-beer, free-as-in-speech software to design a PCB, email it to a dude in China, and get a finished, working, professional-looking PCB back in my hand within a week, for the price of a couple of coffees. And if I want the components stuck on too, it'll cost a little extra, maybe three coffees in total.
If I want it really quickly then for the price of a decent takeaway curry I can have it flown over next day. What the actual hell?
Edit: the slowest part of "next day" is when it hits the UK, and if I could guarantee it just gets delivered to DHL's Edinburgh depot I could drive down there in two hours.
The reality is that's just how cheap commodity circuit construction is, and even small shops in the US sometimes approach that low cost, and we've been paying crazy markup on electronics for decades. That dirt cheap board is so profitable it is using air freight to get back to you. It is literally burning money just for convenience, yet that is the "cheap" option.
Electronics cost a lot to manufacture in the 70s, but production is almost entirely automated now, yet for "reasons" we have only seen a small part of those savings.
When you buy the "cheap" version on AliExpress, they are still making a healthy margin, yet Americans will happily buy the exact same product off Amazon with next-day shipping for 10x the cost and think they are getting a "deal".
This extends to cars as well, with the F-150 costing as little as $20k to build, even with "expensive", heavily unionized, well-compensated labor. The higher-end trims only cost a little more to make but take in far higher profit margins. How much of China's supposedly "subsidized" car prices (as if the US doesn't do anything to subsidize cars) is just a lower profit margin?
Things should be way cheaper to western consumers. Where does all that extra money go? "Marketing and administration", basically bloated executive suites, bloated middle management, and the pockets of Meta, Google, AWS, and Apple. Oh gee, those exact companies seem absurdly wealthy and are basically responsible for all economic growth in the past few decades.
> even small shops in the US sometimes approach that low cost
It's ludicrously expensive to ship things to and from the US, though, and since they're now paying some insane markup because no one understands what tariffs are, the prices have got even sillier.
I saw a similar project a while ago and thought this was about the same thing at first: [1][2]
Both essentially built a DIY chip tester for a 286 and both built around a Harris 80C286.
If I understood it correctly, the goal behind this project seems to be simulating the rest of the PC, purely for the challenge and the learning experience, documenting the process of building the chip tester (and getting mildly philosophical along the way).
The other project was more directly interested in the 286 itself: undocumented instructions, corner cases in segmentation behavior, instruction cycle timing, etc., and also trying to find out whether there are any differences between the Harris and Intel variants.
Managed code and the properties of their C#-derived programming language (static analysis and verification) were used rather than hardware exception handling.
"Operating System Principles" (1973) by Per Brinch Hansen. A full microkernel OS (remake of RC-4000 from 1967) written in a concurrent dialect of Pascal, that also manages to make do without hardware protection support.
In TempleOS, everything runs in ring 0, but that's not the same as doing protection in software (which would require disallowing any native code not produced by some trusted translator). It simply means there's no protection at all.
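As a toy sketch of the "trusted translator" idea described above (all names and the mechanism here are hypothetical; this is a Python illustration, not anything Singularity or TempleOS actually does): the kernel only admits code that the verifying toolchain itself produced, which it can enforce by recognizing those blobs, e.g. by hash.

```python
import hashlib

# Software-based protection, minimally: the loader refuses any code
# blob not emitted by the trusted translator. (TempleOS does neither:
# everything runs in ring 0, unchecked.)
trusted_hashes = set()

def trusted_translate(source):
    """Stand-in for a verifying compiler: 'compile' and register the blob."""
    blob = source.encode()
    trusted_hashes.add(hashlib.sha256(blob).hexdigest())
    return blob

def load(blob):
    """Loader gate: only admit code the trusted translator produced."""
    return hashlib.sha256(blob).hexdigest() in trusted_hashes

ok_blob = trusted_translate("print('hello')")
rogue_blob = b"\x90\x90\xcc"  # arbitrary untrusted machine code
```

A real system would verify properties of the code itself (type safety, memory safety) rather than just its provenance, but the gatekeeping structure is the same: no unvetted native code ever reaches the processor.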
> I just found the acting in the first season really, "soap opera" like. I'm not sure how to describe it better.
When I first started watching, season 1, with its gratuitous 90s CGI, dramatic musical cues, and Michael O'Hare's rather stiff, wooden demeanor, reminded me a lot of those live-action cut scenes that some early CD-ROM games had. I remember thinking at first that this is probably the kind of thing the local basement theater troupe would pull off if they were suddenly told to make a TV show.
> It's still one of my all time favorite shows.
Fully agree. If you haven't seen it yet, I'd highly recommend it as well.
Some wishfully frame HN as science and tech, others as smart people's views on complex issues of topical importance, but regardless, any overlap between politics and science or complexity triggers flagging "because politics". Yes, the forces that degrade discussion run high on political topics, but ... sheesh. A forum with a real capacity for rational discourse on complex, important issues is badly hamstrung by this. /rant
I feel HN approaches politics the same way the field of economics does: not involved at all, except that basically everything in the world heavily involves politics. Tech is no exception. Not wanting to be overrun by news is no crime, but this is pretty impactful news, even for the tech community.