November 12, 2024

From 1+1 in Assembly to LLMs: The Evolution of Computing Abstraction

Tracing the Layers from Machine Code to Natural Language Interfaces

Computing has come a long way since the early days of punch cards and assembly language. With each new generation of programming paradigms, we’ve added layers of abstraction that make it easier for humans to interact with machines. In this post, we’ll explore how a simple operation like 1 + 1 is handled across different programming languages and models—from assembly to modern Large Language Models (LLMs). We’ll see how each layer adds complexity under the hood but offers significant benefits to developers and users alike.

1. Assembly Language: The Birthplace of Computing

In the earliest computers, programmers wrote code directly in assembly language, a low-level language closely tied to the machine's hardware. Each assembly instruction maps almost one-to-one to a machine-code instruction that the CPU executes.

How 1 + 1 Works in Assembly

Here’s how you might perform 1 + 1 in x86 assembly:

section .data
    result db 0       ; Reserve a byte for the result

section .text
    global _start

_start:
    mov al, 1         ; Load 1 into the AL register
    add al, 1         ; Add 1 to the value in AL
    mov [result], al  ; Store the result in memory
    ; Exit program (system-specific code omitted for brevity)

Benefits:

- Complete control over the hardware and every CPU instruction
- Minimal overhead, so code can be extremely fast and compact

Drawbacks:

- Tedious and error-prone to write, even for trivial tasks
- Not portable: code is tied to a specific CPU architecture
- Hard to read, debug, and maintain

2. C Language: Introducing Compilation

To simplify programming, higher-level languages like C were developed. C provides a layer of abstraction over assembly, allowing developers to write more readable code that gets compiled into machine code.

How 1 + 1 Works in C

#include <stdio.h>

int main() {
    int result = 1 + 1;
    printf("1 + 1 = %d\n", result);
    return 0;
}

Benefits:

- Far more readable and concise than assembly
- Portable: the same source can be compiled for different architectures
- Compiler optimizations keep performance close to hand-written assembly

Drawbacks:

- Manual memory management invites bugs such as leaks and buffer overflows
- Still relatively low-level compared to modern languages
- Requires a separate compile step before the program can run

3. Java: Embracing Object-Oriented Programming

Java is one of the most widely used object-oriented programming (OOP) languages. Source code is compiled to bytecode that runs on the Java Virtual Machine (JVM), adding another layer of abstraction, and programs organize data and behavior into objects and classes.

How 1 + 1 Works in Java

public class Addition {
    public static void main(String[] args) {
        int a = 1;
        int b = 1;
        int c = a + b;
        System.out.println("1 + 1 = " + c);
    }
}

Benefits:

- "Write once, run anywhere": the same bytecode runs on any platform with a JVM
- Automatic memory management via garbage collection
- Strong typing and OOP structure help manage large codebases

Drawbacks:

- More verbose than many modern languages, even for simple tasks
- The JVM adds startup time and memory overhead
- Yet another layer between your code and the hardware

4. Python: The Rise of Interpreted Languages

As of 2024, Python has become one of the most popular programming languages, especially for scripting, data science, and rapid application development. Python code is executed by an interpreter (CPython compiles it to bytecode behind the scenes), so there is no separate build step.

How 1 + 1 Works in Python

a = 1
b = 1
c = a + b
print(f"1 + 1 = {c}")

Benefits:

- Concise, readable syntax: our example is only four lines
- No explicit compile step: just run the script
- A huge ecosystem of libraries for almost every domain

Drawbacks:

- Typically slower than compiled languages like C or Java
- Dynamic typing pushes many errors from compile time to runtime
- Less control over memory and low-level details

5. Large Language Models (LLMs): Conversational Computing

Now, we have reached an era where you can ask an LLM like ChatGPT to compute 1 + 1.

How 1 + 1 Works in an LLM

You simply type a prompt such as "What is 1 + 1?". The model tokenizes the text, converts the tokens into numerical embeddings, passes them through many transformer layers of matrix multiplications and attention, and then predicts the most likely next token: "2". Crucially, the model does not calculate the way a CPU does; it predicts the answer from patterns learned during training.
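
To illustrate the idea, here is a toy sketch, not a real model: the last step of an LLM's answer is picking the highest-probability token from scores the network produces. The tiny vocabulary and the logits below are invented purely for demonstration; a real LLM derives its logits from billions of learned parameters.

```python
import math

# Toy, hypothetical sketch of the final prediction step of an LLM.
# The vocabulary and logits are made up for illustration only.
VOCAB = ["0", "1", "2", "3", "4"]

def softmax(logits):
    """Turn raw scores into probabilities that sum to 1."""
    exps = [math.exp(v - max(logits)) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_next_token(prompt):
    # In a real model, these logits would come from transformer layers
    # processing the tokenized prompt; here they are hard-coded.
    logits = [0.1, 0.5, 4.2, 0.3, 0.2]  # "2" gets the highest score
    probs = softmax(logits)
    best = max(range(len(VOCAB)), key=lambda i: probs[i])
    return VOCAB[best]

print("1 + 1 =", predict_next_token("What is 1 + 1?"))  # prints "1 + 1 = 2"
```

The key point the sketch captures: the answer "2" is selected because it is the most probable token, not because any addition circuit fired.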

Benefits:

- Natural language interface: no programming knowledge required
- Can explain its reasoning and handle follow-up questions
- A single interface for an enormous range of tasks

Drawbacks:

- Predicts answers rather than computing them, so correctness is not guaranteed
- Enormous computational cost, even for trivial questions
- Can struggle with larger or multi-step arithmetic

The Future of AI Beyond LLMs

While LLMs are powerful, they have limitations due to their inability to perform explicit computations or symbolic reasoning. Researchers are exploring ways to address these weaknesses:

- Tool use: letting the model delegate arithmetic to a calculator or code interpreter instead of predicting the answer
- Neuro-symbolic approaches that combine neural networks with symbolic reasoning engines
- Retrieval-augmented generation, which grounds answers in external data sources

Even this simple example involves a large amount of computation, because every prompt passes through the extensive matrix operations of the transformer architecture. As a result, running LLMs typically requires specialized hardware, such as GPUs, with significant processing power.
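
As a rough illustration of where that computation goes, here is a pure-Python sketch of scaled dot-product attention, the core operation repeated across every transformer layer. The matrices below are made-up 2-token, 3-dimensional examples; real models use thousands of dimensions and stack many such layers.

```python
import math

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def softmax(row):
    """Turn one row of scores into probabilities that sum to 1."""
    exps = [math.exp(v - max(row)) for v in row]
    total = sum(exps)
    return [e / total for e in exps]

def attention(q, k, v):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(q[0])
    k_t = [list(col) for col in zip(*k)]            # transpose K
    scores = matmul(q, k_t)                         # Q K^T
    scaled = [[s / math.sqrt(d) for s in row] for row in scores]
    weights = [softmax(row) for row in scaled]      # attention weights
    return matmul(weights, v)                       # weighted mix of V rows

# Tiny made-up inputs: 2 tokens, 3 dimensions each.
Q = [[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
K = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]
V = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

out = attention(Q, K, V)
print(out)  # each output row is a probability-weighted blend of the rows of V
```

Even this 2-token toy performs three matrix multiplications; scale the dimensions up by a factor of thousands and repeat over dozens of layers, and the need for GPUs becomes clear.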

The Common Denominator: Transistors and Logic Gates

Despite the increasing layers of abstraction, all these computations ultimately run on silicon-based transistors (although we might move away from silicon in the future) using logic gates to process binary data (1s and 0s).
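
To make that concrete, here is a minimal sketch in Python of how hardware adds 1 + 1 at the gate level: a half adder, built from an XOR gate (which produces the sum bit) and an AND gate (which produces the carry bit).

```python
def half_adder(a, b):
    """Add two one-bit values the way hardware does:
    an XOR gate yields the sum bit, an AND gate yields the carry bit."""
    sum_bit = a ^ b   # XOR gate
    carry = a & b     # AND gate
    return carry, sum_bit

carry, s = half_adder(1, 1)
print(f"1 + 1 = {carry}{s} in binary")  # carry=1, sum=0 -> binary 10 = decimal 2
```

Chain half adders together (with an extra OR gate per stage to merge carries) and you get the full adders that sum multi-bit numbers inside every CPU, no matter which abstraction layer the addition was written in.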

Why Add More Abstraction Layers?

Each new layer trades raw machine efficiency for human productivity. Higher-level abstractions mean fewer lines of code, fewer opportunities for mistakes, and far more people who can program at all. The cost (extra computation under the hood) has historically been absorbed by ever-faster hardware.

Conclusion: Bridging the Gap Between Humans and Machines

From manually coding in assembly to interacting with machines using natural language, we’ve significantly bridged the gap between humans and computers. While each layer of abstraction adds complexity beneath the surface, the benefits in usability, productivity, and accessibility are undeniable.

What a time to be alive!

As technology continues to evolve, we can expect even more intuitive ways to interact with machines, making computing accessible to an even broader audience.