In part 1 we looked at some of the bad decisions of the eight-bit era, and in part 2 the more expensive mistakes of the 16-bit era. The 32-bit era was a little different.
The original x86 architecture emerged in 1978 with the 16-bit 8086 processor. As computing demands grew over the decades, the architecture was extended to handle 32-bit and then 64-bit computing.
The 16-bit computer design example is especially worth a look. Want a more complex example? Here’s a blazing-fast 8080 CPU built from bit-slice parts.
Do you remember the jump from 8-bit to 16-bit computing in the 1980s, and the jump from 16-bit to 32-bit platforms in the 1990s? Well, here we go again: we double up once more, this time leaping from 32-bit to 64-bit.
The whole thing reminds us of a 16-bit computer like the PDP-11, where everything is organized around two-byte words. There are only 4K bytes of memory (so 2K words).
To reach a single byte, the computer would have to go first to the word address and then pick out the byte within it. In a 16-bit-word environment, the trade-off was minor and offered advantages to the hardware manufacturers.
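To make the byte-within-a-word idea concrete, here is a minimal C sketch of how a word-addressed memory might expose byte reads. The memory size matches the 2K-word figure above, but the function name and the little-endian byte order within each word are illustrative assumptions, not taken from any particular machine.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative word-addressed memory: 2K 16-bit words = 4K bytes. */
#define WORDS 2048
static uint16_t memory[WORDS];

/* Read one byte by first locating its word, then selecting the low or
   high half of that word (little-endian order assumed). */
uint8_t read_byte(uint16_t byte_addr) {
    uint16_t word = memory[byte_addr >> 1];  /* word address = byte address / 2 */
    if (byte_addr & 1)
        return (uint8_t)(word >> 8);         /* odd byte: high half */
    return (uint8_t)(word & 0xFF);           /* even byte: low half */
}

int main(void) {
    memory[0] = 0xBEEF;                      /* word 0 holds bytes 0xEF and 0xBE */
    printf("byte 0 = 0x%02X, byte 1 = 0x%02X\n", read_byte(0), read_byte(1));
    return 0;
}
```

The extra shift and mask are exactly the "scan within the word" step: cheap in hardware, but it means byte access is never quite as direct as on a byte-addressed machine.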
The move from 32 bits to 64 bits won't be easy -- and without backward compatibility, you're really stuck. I remember when Windows moved from 16-bit computing to 32 bits; I had one client who kept ...
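As a hypothetical illustration of why such moves hurt (not drawn from the article itself), here is the kind of 32-bit-era assumption that quietly breaks when old code is rebuilt for 64 bits:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    int value = 42;

    /* Classic 32-bit-era habit: stashing a pointer in a long. On LP64
       Unix this still works by accident; on 64-bit Windows (LLP64),
       long stays 32 bits and the pointer gets truncated. */
    long stashed = (long)(uintptr_t)&value;

    /* Portable alternative: uintptr_t is guaranteed to hold a pointer. */
    uintptr_t safe = (uintptr_t)&value;

    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    printf("pointer survives the round trip? %s\n",
           (uintptr_t)stashed == safe ? "yes" : "no (truncated)");
    return 0;
}
```

Code riddled with assumptions like this compiles cleanly and then misbehaves at runtime, which is why a straight recompile was rarely the end of a 32-to-64-bit port.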