The Computer Chronicles – Memory Management (1990)

Today we take for granted just how much memory (in a sense, raw working space) systems ship with, or how much is available to a custom machine builder. 8, 16, 32, even 64 gigabytes of RAM are easily available in most any system you could buy or build today. Just a decade ago, 4 gigabytes was considered high. Two decades ago, 1 gigabyte was a dream, and three decades ago, one megabyte was considered pretty decent.

This episode of the Computer Chronicles takes us back to 1990, when the PC industry was in an interesting spot. Limitations built into the Intel 8086 processor were holding the computer industry back at what has famously been known as the “640K barrier”: the 8086 could address only one megabyte in total, and the IBM PC’s memory map reserved the upper 384K for video memory, adapter ROMs, and the BIOS, leaving programs just 640K to work with. Period. While the later Intel 286 and 386 processors were capable of addressing far more, the general design of software, along with compatibility requirements, meant most software ran those chips much like a faster 8086, never taking advantage of the newer processors’ capabilities.
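For readers curious where those numbers come from, here is a minimal sketch, in C and entirely my own illustration rather than anything shown in the episode, of how real-mode segment:offset addressing caps the 8086 at roughly one megabyte, and how the IBM PC’s reservation of the upper 384K leaves 640K of conventional memory:

```c
#include <stdio.h>

/* Real-mode 8086 addressing: physical address = segment * 16 + offset.
   With 16-bit segment and offset values, the highest reachable address
   is 0xFFFF * 16 + 0xFFFF = 0x10FFEF, i.e. just over 1 MB.
   On the IBM PC, 0xA0000-0xFFFFF (the upper 384K) was reserved for video
   memory, adapter ROMs, and the BIOS, leaving 0x00000-0x9FFFF (640K) as
   "conventional memory" for DOS and application programs. */

static unsigned long phys_addr(unsigned segment, unsigned offset)
{
    return (unsigned long)segment * 16UL + (unsigned long)offset;
}

int main(void)
{
    /* Example segment:offset pairs; any pair maps the same way. */
    printf("0x0000:0x0000 -> 0x%05lX (bottom of conventional memory)\n",
           phys_addr(0x0000, 0x0000));
    printf("0x9FFF:0x000F -> 0x%05lX (top of the 640K region)\n",
           phys_addr(0x9FFF, 0x000F));
    printf("0xA000:0x0000 -> 0x%05lX (start of the reserved 384K)\n",
           phys_addr(0xA000, 0x0000));
    printf("0xFFFF:0xFFFF -> 0x%05lX (highest real-mode address)\n",
           phys_addr(0xFFFF, 0xFFFF));
    return 0;
}
```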

It was in this time frame that a multitude of companies came out with a variety of techniques to combat this issue. Some of them were simple and made perfect sense; others were incredibly complex and odd, yet quite functional.

Of course, alternatives to DOS / Windows on x86 hardware existed. You could use a Macintosh, which had its own memory issues, more akin to what we’re used to today (with similar ways of solving them). Or, at the time, OS/2 would have been a good option, given that it was designed to run on the more modern processors of the day rather than being held back, as DOS and Windows were, by their design legacies.

The “640K Barrier” is an oddly interesting part of computer history, to me, and defeating this barrier is the subject of this episode of The Computer Chronicles.
