
> Unfortunately, the Apple III was a business failure due to reliability issues and competition from the IBM PC introduced a year later.

Curious about these reliability issues, I read more about the Apple III on Wikipedia, and it's quite interesting:

> There was way too short a time frame in manufacturing and development. When the decision was made to announce, there were only three Apple IIIs in existence, and they were all wire-wrapped boards.

> The case of the Apple III had long since been set in concrete, so they had a certain size logic board to fit the circuits on ... They went to three different outside houses and nobody could get a layout that would fit on the board.

> They used the smallest line circuit boards that could be used. They ran about 1,000 of these boards as preproduction units to give to the dealers as demonstration units. They really didn't work ... Apple swapped out the boards. The problem was, at this point there were other problems, things like chips that didn't fit. There were a million problems that you would normally take care of when you do your preproduction and pilot run. Basically, customers were shipped the pilot run.

https://en.wikipedia.org/wiki/Apple_III



The Apple III is actually a pretty interesting machine as far as older 8-bit systems go, and it includes some unique hardware features to work around the limitations of the 6502 CPU:

- The machine supports a maximum of 512 kB of RAM.

- The 6502 zero-page can be relocated on the fly to anywhere in the 512 kB address space.

- Certain 6502 addressing modes can access the entire 512 kB address space without bank switching, where it otherwise would only be able to access 64 kB.

- The area of RAM reserved for the operating system and device drivers can be protected from regular apps.

- It also includes built-in RGB video at 280x192x16 colors, and a 6-bit DAC for sound.

But even without the initial reliability problems the machine probably didn't stand a chance in the market.

It fell in the trap of being simultaneously too much computer (too expensive for what it does) and not enough computer (not powerful enough for its rather complicated operating system). And its 8-bit architecture would be quickly eclipsed by 8086 and 68000-based machines, so it was both state of the art and outdated at the same time.

Apple was right in that they needed a machine that was significantly more powerful than the Apple II, but their timing was off and the architecture was wrong. The IBM PC did a much better job of hitting the sweet spot of price vs. power vs. complexity.


Another problem with the Apple III was that Steve Jobs didn't like cooling fans or vents so the Apple III (like the Apple II and Mac) didn't have a fan. Reportedly, the Apple III got hot enough to warp or melt floppy disks. Another problem with heating/cooling cycles was that integrated circuits would migrate out of their sockets. Apple's recommended "fix" was for customers to lift the computer a few inches and drop it, hopefully reseating the ICs.

https://www.techjunkie.com/apple-iii-drop/


It's curious to see how far back the form over function at Apple goes. Truly part of the company spirit.


And then 20 years later Steve Jobs repeated the same mistake with the Power Mac G4 Cube (no fan).

Though ultimately the tech got good enough. Another 20 years past the G4 Cube, we now have the 5nm A14 in a smartphone or iPad Air that is powerful enough for 80-90% of consumer use cases, all without a fan. We also have a roadmap to 3nm and 2nm around 2025.

Unfortunately Steve did not live long enough to see it.


I seem to recall hearing that Steve Jobs loved the beautiful carved wood prototype Apple /// case design* that he had commissioned and wasn't willing to change it to fit the logic board (or to move the floppy drive outside the case, etc.), instead directing the designers to "saw it in half", i.e. split it into two smaller boards, connected by a ribbon cable. Unfortunately the ribbon cable put tension on the boards, causing them to flex, which could cause the socketed RAM chips to unseat, as cited in the Wikipedia article, hence the "pick up your Apple /// and drop it" approach to re-seating the chips.

Jobs remained adamant that computers with fans were junk, so the original Macintosh also lacked a cooling fan, in spite of having an internal power supply, CRT, and supporting analog board. This presumably contributed to the large number of analog board failures in classic compact Mac models.

*which might be part of Stanford's Apple museum collection or at the Computer History Museum?


Compatibility with the Apple II was horrible too. How they didn't get that right is just beyond belief.


At the time compatibility was not considered as important as you might think today.

# Radio Shack

Model I

Model II (not compatible)

Model III (weakly compatible with Model I)

Color Computer (not compatible)

# Commodore

PET

VIC (not compatible)

64 (not compatible)

Amiga (not compatible)

# Atari

400

800 (mostly compatible)

ST (??)

# IBM

PC

PC jr (weakly compatible with the PC)

PS/2(mostly compatible)

Doing this from memory. I'm sure I got some of this wrong, but it illustrates the point that compatibility had not been shown to be so important.


Also, Texas Instruments released its Professional Computer, a TI PC going up against IBM PC. It ran MS-DOS and used an 8088 CPU, but it was not a clone and wasn't totally compatible. (Different graphics, different expansion bus, slightly different MS-DOS, slightly different MS-BASIC, etc.)

Byte Magazine's review (https://archive.org/details/byte-magazine-1983-12/page/n287/...) described it as "daring to be somewhat different".


You'll notice it and the DEC Rainbow didn't last long. Compatibility was important. The PC AT showed that compatibility plus more power was important. After all, that was the whole point of the turbo switch.


It's interesting to ask why people didn't understand the importance of compatibility at first.

One reason could be the tendency to disregard the cost of software, which is a mistake people still make today.

But there was also another factor back then: computer hardware was very expensive. It is understandable to think that, if you're going to spend a gigantic pile of money on computer equipment, then it needs to be the absolute best computer equipment you can get. I think wanting to squeeze the most value out of your investment could explain why buyers and manufacturers were willing to sacrifice compatibility.


Well, it was really important for Apple because the II had all the software, and what came with the III was junk. They did have an emulator, but it didn't do 80 columns at first.

The Atari 400/800 had a sort-of-compatible successor in the 1200XL, which wasn't compatible enough and so didn't sell well. It was succeeded by the 600XL and 800XL, which were actually compatible. Those were followed by the 65XE/130XE/XEGS. The ST was a parallel, different line.

The IBM PC jr was just IBM being idiots with their product differentiation. Notice, it didn't do well because it sucked at compatibility.

Amiga was bought.

Compatibility was actually important when the predecessor system was popular. Some companies just didn't do it well.


Woz didn't like that there were extra chips in the design to disable the more advanced features when using Apple II compatibility mode.



