Fifty years ago, in the age of tediously hand-woven core-rope memory and hand-written programming, a computer with a processor clocked at about 1 MHz, a fixed memory (ROM) of 36K words and an erasable memory (RAM) of 2K words (1 word = 16 bits), driven by 70 watts of power supplied at 28 volts DC, landed man on the moon for the first time.

We’re talking, of course, about the Apollo Guidance Computer (AGC). Each mission carried two of them, plus an emergency backup system, and it was the AGC that calculated the lunar descent. State-of-the-art in 1969, the AGC was nonetheless, in NASA’s own words, technically “insufficient”.

The world watched in awe then – how could something so monumental be achieved with so little? As an aspiring student of computer science, I find it mind-boggling that I am using massively more computing power to write this article than NASA had available for the entire Apollo programme. Gigahertz speeds, gigabytes of memory, Wi-Fi, SSDs and solid-state RAM were figments of science fiction at the time. And yet, the Apollo team were able to transform science fiction into fact.

The AGC would struggle to compete with one of the cheapest computers today, the $35 minimum-spec Raspberry Pi 4 with its 64-bit 1.5 GHz CPU and 1 GB RAM. Even a basic wearable such as the Samsung Galaxy Fit, with a 16 MHz processor, surpasses the AGC’s computing power many times over.

In creating the AGC’s dedicated microcomputer system for the Apollo missions, designers at MIT, along with builders at Raytheon, delivered revolutionary innovations which have not only stood the test of time but remain the basis on which modern computing stands.

Client-server computing

Client-server computing underpins our use of internet technology. Whether we use our personal devices (clients) to browse the web, stream live media or find our way around with satellite navigation, all the hard work is, in fact, done by powerful servers in the background, rapidly sifting through vast amounts of data and computations to serve us the precise information we need on our client devices.

So, though our sleek hand-helds are touted as ‘smart’ and all-powerful, there is no way so much data and computing power can be loaded onto something portable (yet). The Apollo programme utilised ground-based IBM mainframe computers (servers) to handle the complex navigational calculations with vast arrays of data banks, serving up the results to the AGCs (the clients) in real time. This eliminated the need for huge amounts of data and computer equipment to be flown to the moon and back, and allowed for the creation of the compact yet effective AGC.
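
The division of labour described above (a light client sending small requests to a server that does the heavy lifting) can be sketched in a few lines of Python. The "square" protocol below is entirely made up for illustration; it simply stands in for the navigational calculations the mainframes performed.

```python
# Minimal client-server sketch: the server does the computation,
# the client only sends a request and reads back the answer.
# The "square <n>" protocol here is invented for illustration.
import socket
import threading

def server(listener):
    conn, _ = listener.accept()
    with conn:
        request = conn.recv(1024).decode()          # e.g. "square 12"
        op, value = request.split()
        result = int(value) ** 2 if op == "square" else 0  # the "heavy" work
        conn.sendall(str(result).encode())

def ask_server(port, request):
    # The client's only job: send the question, display the answer.
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall(request.encode())
        return conn.recv(1024).decode()

def demo():
    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))                 # pick any free port
    listener.listen(1)
    port = listener.getsockname()[1]
    t = threading.Thread(target=server, args=(listener,))
    t.start()
    answer = ask_server(port, "square 12")
    t.join()
    listener.close()
    return answer
```

Running `demo()` returns the server-computed result, just as the AGC received pre-computed navigational data rather than crunching it all on board.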

Embedded computing

Was the AGC the world’s first satnav (satellite navigation) too? While the AGC relied on ground-based navigational servers for most of the mission, there were times when it had to navigate independently, using local data and radar. This was especially important during the lunar descent/ascent, when communications with Earth would be too slow because of the transmission delay of over a second each way, or while orbiting around the far side of the moon, where communication was impossible. This requirement for the AGC to operate independently, and while in motion, makes it the first embedded portable computer system of its kind, a breakaway step from the large on-premise mainframes of that era, and a distant ancestor of many embedded portable devices we use today. The chipset in your smartphone, for instance.

Cross-platform integration

In integrating two vastly different remote client and server computers (the AGC and the IBM mainframes), a foundational step in cross-platform integration was established, and it remains relevant even today. The principle is core to many modern applications: email, social media, Wi-Fi, Bluetooth, cloud storage, digital wallets, online banking and payments, cryptocurrency and IoT, among a host of other daily tech we rely on without a second thought.

Real-time computing

The Apollo programme was the first to use software to perform real-time decision-making. Prior to this, computing was largely executed by mainframes in batch processing mode, where data was submitted for processing in batches and the results were delivered later, usually as a report. Today, no one likes to wait and real-time is the norm. Satnav (again), online music and video, gaming, Virtual Reality, Augmented Reality… anything computed digitally in real time today has its roots firmly in the Apollo programme innovations of the 60s. So, no, Pokémon did not just get picked off your real city streets – they are a legacy of Space tech.
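
The contrast between the two modes can be made concrete with a toy example. The functions and readings below are invented purely to illustrate the difference: batch processing collects everything first and reports afterwards, while real-time processing reacts to each input the moment it arrives.

```python
# Illustrative contrast between batch and real-time processing.
# The readings and the thruster rule are made up for this sketch.

def batch_process(readings):
    """1960s mainframe style: gather all the data, report later."""
    return {"count": len(readings), "mean": sum(readings) / len(readings)}

def real_time_process(reading, threshold):
    """Apollo style: decide on each reading as it arrives."""
    return "FIRE THRUSTER" if reading > threshold else "HOLD"
```

A batch report on yesterday's telemetry is useless during a descent; the real-time path answers before the next reading lands.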

Human-computer interaction

Telling a machine what to do, or changing how it behaves while it is running, began with the AGC’s Display and Keyboard unit (DSKY). For the first time, numerical data was displayed on-screen in real time, and could be changed through keystrokes. This was a fundamental shift in how people interacted with machines.
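
The DSKY's style of interaction (key in a command, see the numbers change on the display) can be imitated with a toy dispatcher. The commands, register names and values below are illustrative inventions, not the AGC's actual codes or telemetry.

```python
# Toy imitation of DSKY-style interaction: the operator keys in what
# to do ("display" or "load") and what to do it to. All names and
# numbers here are made up; they are not real AGC codes or data.
REGISTERS = {"velocity": 5514, "altitude": 30000}   # invented telemetry

def dskey(verb, noun):
    if verb == "display":                 # show a value on the panel
        return f"{noun.upper()}: {REGISTERS[noun]}"
    if verb == "load":                    # operator overwrites a value
        noun_name, new_value = noun
        REGISTERS[noun_name] = new_value
        return f"{noun_name.upper()} SET"
    return "OPR ERR"                      # roughly, the DSKY's error light
```

The key point is the loop: the human reads live numbers, keys in a change, and the running machine responds immediately.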


Multitasking

The ability to run multiple programs at once is now commonplace in all modern computers. While newer, more sophisticated features have been developed to improve multitasking capabilities, ultimately, the seminal work which laid the foundation for this approach stems from MIT’s resource-sharing design for the AGC.
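
The idea of several programs sharing one processor can be sketched with Python generators standing in for the jobs. This is only a minimal illustration of the concept; the AGC's actual Executive worked differently, and the task names below are invented.

```python
# Minimal sketch of resource sharing: several "programs" take turns
# on one processor. Generators stand in for jobs; the real AGC
# Executive's mechanism differed. Task names are invented.

def task(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"          # one slice of work, then yield

def run_all(tasks):
    log = []
    while tasks:                          # keep cycling until all done
        for t in list(tasks):
            try:
                log.append(next(t))       # give this task one turn
            except StopIteration:
                tasks.remove(t)           # task finished, drop it
    return log
```

Each task does a slice of work and hands the processor back, so every program makes steady progress on a single CPU.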

Processor prioritisation

Once you allow a computer’s CPU to handle multiple programs at once, the issue of priority becomes pertinent. The Apollo designers had to code the AGC’s programs to ensure that the most vital programs were executed reliably and without delay, even if it meant that less critical functions had to wait. This feature proved vital in the final descent stage of the lunar landing and is fundamental to all modern computer systems.

From what we know, just prior to the final touchdown on the lunar surface, the AGC was flooded with spurious signals from an on-board radar and began raising overload alarms. As this happened during the final descent, it was crucial that the astronauts and ground team worked together to rapidly overcome the crisis. While it was initially feared that the AGC had failed, it soon became apparent that an innovative aspect of the design had been invoked. Processor prioritisation saved the day by ensuring that only the most vital tasks (here, navigational descent control) continued to be executed by the AGC, while the less important tasks were shed.
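
The behaviour described above (when there is more work than processor time, run only the most urgent tasks and shed the rest) can be sketched with a priority queue. This is an illustration of the general technique, not the AGC's actual Executive; the task names and priorities are invented.

```python
# Sketch of priority-based task shedding under overload. Illustrative
# only: not the AGC's actual scheduler. Lower number = more urgent.
import heapq

def run_with_priorities(tasks, capacity):
    """tasks: (priority, name) pairs; capacity: how many the CPU can run."""
    queue = list(tasks)
    heapq.heapify(queue)                  # most urgent task always on top
    executed = []
    for _ in range(capacity):             # processor time for `capacity` tasks
        if not queue:
            break
        _, name = heapq.heappop(queue)
        executed.append(name)
    return executed                       # everything still queued is shed
```

With capacity for only two tasks, descent control and the display run; lower-priority telemetry and spurious radar work are simply dropped.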

While the AGC obviously did an amazing job – taking the astronauts safely to the moon and back – it also became the basis for much of the computing we do and use today. Interestingly, the AGC even had one quirk in common with the most sophisticated self-driving vehicles and automatic navigation systems of today. Neil Armstrong spotted it when he found that the system was planning to land in a boulder-strewn crater (akin to your satnav selecting a sub-optimal route when you are in a hurry). Armstrong manually piloted the craft to a safer location, completing a resounding triumph of technology. And while we still enjoy the benefits of what was created then, the real heroes, for the foreseeable future, remain human rather than digital.

Crish Chengappa is a Class XII student from Bangalore and founder of Let’s Heartify – an awareness portal to help alleviate Congenital Heart Disease. He aspires to pursue a career in computer science and AI.