
Indeed, starting with IBM's initial 5150 design, early PC graphics made cost, memory, and capability trade-offs that would soon be seen as unfortunate from a graphics and gaming perspective. Although IBM specced the platform and chose Motorola's 6845 video display chip, I assign some blame to Motorola too for not having created a range of video chips with increasing capabilities to choose from. We'll never know whether IBM would have ponied up a few dollars more for a chip with at least a 256-color palette or a few other niceties, but it's always possible.

Strangely, Motorola did eventually decide to get serious about offering more capable graphics in the form of the RMS chipset, but by then it was too little, too late. They announced the RMS chipset in 1984 and tried to drum up interest among system designers, but eventually cancelled it before release amid a lukewarm response and bugs in the early prototypes (https://retrocomputing.stackexchange.com/questions/10977/fat...). It certainly didn't help that other options like TI's 99x8 VDP chips were getting cheaper, and that Amiga (the pre-Commodore company formerly known as Hi-Toro) was shopping its chipset around to all the major consumer computer manufacturers in 1984.



IBM only gave maybe 1.0 shits about gaming, to the extent they needed "business graphics" like charts, and maybe just some extra fun shit. The primary competition was loads of CP/M "business micros" with not many real graphical games at all. IBM benchmarked the Apple II+ with a Z80 Softcard because that was the ultimate mullet machine, all the business software upfront, all the gaming party in the back. CGA was good enough for an Apple II game or a pie chart, and that's all they cared about.


+1 for describing the "Apple II+ with a Z80 Softcard" as "the ultimate mullet machine, all the business software upfront, all the gaming party in the back."

I agree with your point: the bar IBM was shooting for was set by the existing popular microcomputers of circa 1979. The only significant consideration for future growth/competition was seemingly that the established trend of RAM size growth would probably continue. At the time there wasn't really any established trend of progressive growth in graphics resolution or colors. Pre-Apple II examples like the Cromemco Dazzler for the Altair weren't fundamentally different from the Apple II and probably weren't even on IBM's radar, being barely out of the kit/hobbyist level.

I'll add that when considering the 5150's initial design, the "IBM" we're talking about isn't really "The IBM" but rather a lone skunkworks project in a backwater division down in Boca Raton, Florida, intended as an experiment to learn more about these new microcomputers. Most of the rest of the traditional IBM management structure barely knew about it during development, and the parts that did mostly ignored it. If 'mainstream IBM' had approached the PC as a real IBM project, it would certainly have been very different and probably unsuccessful (if it had managed to ship at all). As it was, the 5150 was only able to use off-the-shelf components (including the CPU) because it was considered a one-off experiment, initially given a month for the design and a year to ship.


> RAM size growth would probably continue.

True. But note: for a very long time RAM has grown by periodically doubling the capacity of a single chip, and early chips had no controller inside, so they required very short traces to the bus chip or CPU.

And usually the old chip only gets, say, 10% cheaper, while the double-capacity chip is priced ~50% higher than the old one, and to adopt the new chips you need a new memory controller with additional pins.

> At the time there wasn't really any established trend of progressive growth in graphics resolution or colors

Unfortunately, only partially true.

You may have heard about the RAMDAC in video forum topics. It is partly a palette lookup, but also the generator of the video signal, reading from RAM very fast.

The problem is that early "fast page" DRAM had a very slow interface, so when larger chips became available (with cheaper kilobytes than the older ones, which was the real logic of semiconductor progress), the speed of the RAM did not grow with them. Unfortunately, once this became the bottleneck it limited the achievable pixel rate, so even with twice the RAM you could not get twice the resolution.

In the past I calculated a few times how much RAM bandwidth is needed for the classic 60 FPS, and at least up to (and including) the first SDRAM machines, just scanning out the screen was enough to eat a significant share of main RAM throughput, so integrated graphics could even affect CPU performance.
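Roughly, a back-of-the-envelope version of that calculation looks like this (function name and figures are illustrative only, ignoring blanking intervals and any CPU/blitter traffic):

    # Scan-out bandwidth needed just to refresh the screen at a given
    # resolution and depth, ignoring blanking and any other memory traffic.
    def scanout_mb_per_s(width, height, bits_per_pixel, fps=60):
        return width * height * bits_per_pixel / 8 * fps / 1e6

    print(scanout_mb_per_s(640, 480, 8))    # ~18.4 MB/s
    print(scanout_mb_per_s(1024, 768, 8))   # ~47.2 MB/s
    # Early FPM/EDO systems sustained on the order of tens to low hundreds
    # of MB/s, so display refresh alone could eat a noticeable share of it.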

On consoles the problem was not so harmful because of the limited resolution of consumer TVs, but a few consoles used an expensive frame buffer inside the graphics chip.

On modern GPUs the RAM throughput problem is solved by using VRAM chips designed for very high clocks together with an extremely wide RAM bus, so chips run in parallel: a typical computer bus is ~64 bits, but GPUs start at 128 bits and top models have 512 or even 1024 bits.
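The same arithmetic shows why the wide buses matter; the numbers below are generic examples (my own, not any particular card's spec):

    # Peak theoretical bandwidth = (bus width in bits / 8) * effective data rate.
    def peak_gb_per_s(bus_width_bits, data_rate_gtps):
        return bus_width_bits / 8 * data_rate_gtps

    print(peak_gb_per_s(64, 3.2))   # ~25.6 GB/s: one 64-bit DDR4-3200 channel
    print(peak_gb_per_s(256, 14))   # ~448 GB/s: 256-bit GDDR6 at 14 GT/s
    print(peak_gb_per_s(512, 14))   # ~896 GB/s: the same chips on a 512-bit bus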


I once read that IBM had contacted Atari about licensing their chipset, so they did actually care about gaming to some degree.

Also, a lot of Apple users gamed on a monochrome monitor, so the number of colors maybe wasn't the biggest concern, just 'has some'. The resolution was largely fixed by the tube technology.


Interesting. I hadn't heard that about Atari. The odd thing is that the Atari 400/800 chipset couldn't display 80-column text, which seems to have been a must-have for IBM since word processing and terminal emulation were considered essential.

I wonder if maybe that was when IBM was working on the PCjr.


Yeah, the impression I have is the talks went nowhere, but Atari was obviously on top of the market on that point, so no surprise they made a call. Maybe IBM wanted to contract something out, but IIRC Jay Miner had already quit.


Atari's Sunnyvale Research Lab (SRL), run by Alan Kay, was working on some graphics chipset at that time. Probably why IBM came knocking.


The Motorola 6845 CRTC chip is quite versatile, and one of its unique characteristics is that it knows and cares nothing about the resolution or number of colors on the screen. It is just a display address generator, meant to provide some external hardware with a memory address that contains data to be displayed at some part of the screen. What to do with this address and data is completely up to the computer hardware, which can interpret it however it wants. So there is nothing in the 6845 chip that prevents using it to display 256, 4096, or 16,777,216 colors on the screen.
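A toy sketch of that split (simplified, not the real 6845 register set or timing): the CRTC side only walks addresses in display order, and whatever sits next to it decides what the fetched bytes mean:

    # Toy model: the CRTC side only generates display addresses, cell by cell.
    def crtc_addresses(start_addr=0, chars_per_row=80, rows=25):
        addr = start_addr
        for _ in range(rows * chars_per_row):
            yield addr
            addr += 1

    # Hypothetical external decoder: here each fetched byte indexes a
    # 256-entry palette -- a choice the 6845 itself knows nothing about.
    video_ram = bytes(80 * 25)
    palette = [(i, i, i) for i in range(256)]
    frame = [palette[video_ram[a]] for a in crtc_addresses()]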


Interesting, I didn't know that. I had just assumed the products that used it were using most or all of its capabilities. Do you know if there were any early-80s computers that leveraged the 6845 in impressive ways?



