Tales of a 20th-Century Software Engineer

I have been in the software engineering business for over 40 years, and I was never as happy as when I was programming—that is, writing code. I think this sentiment is shared by many of us who have had the fortune of climbing through the various roles in this industry.

I started in 1986, leaving behind the days of punch cards and tapes. Back then, we still wrote code on paper, but the white lab coats were already a thing of the past.

It was a time when an error was just that, an error, and people would waste no time reminding you of it. It wasn’t a “bug,” something we now accept as part of a program’s maturation process, as though programs needed to bake under the sun to finish developing.

Nowadays, our industry is excessively tolerant of errors, and this has worsened as programming languages and environments have evolved. Tools like “copy/paste,” “debug,” “autocomplete,” “IntelliSense,” and AI have fostered a culture of building software that assumes errors, involves little thought, relies on trial and error, abandons original algorithmic thinking, and blindly trusts code reuse.

For me, one of the thrills of programming was achieving zero errors on the first try, like a “hole in one.” It was admittedly a myth, but striving for it gave me immense satisfaction. Programming demanded such focus that everything else around me would vanish for hours, days, even nights. If, upon running the program, the result wasn’t as expected, it was already a failure—because it should have worked on the first attempt.

The other pleasure tied to programming was diagnosing issues. Back in the day, you had to channel your inner Dr. House. Starting from a handful of facts and hypotheses, you had to identify the cause of a problem and correct it. Let me remind younger programmers: there was no “debug” option, no remote-control tools. Instead, we inserted “watchpoints” into the code to check variable contents, always mindful of memory constraints.
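For younger readers: a “watchpoint” in that sense was usually nothing more than a print statement dropped at a suspicious spot and stripped out before shipping. Here is a minimal sketch in C, with invented names, of the kind of thing we would do:

```c
#include <stdio.h>

/* Hypothetical routine under suspicion: why is the total coming out wrong? */
static long add_invoice_lines(const long *amounts, int count)
{
    long total = 0;
    for (int i = 0; i < count; i++) {
        total += amounts[i];
#ifdef WATCH
        /* "Watchpoint": dump the loop state, to be removed before shipping. */
        fprintf(stderr, "i=%d amount=%ld total=%ld\n", i, amounts[i], total);
#endif
    }
    return total;
}

int main(void)
{
    long amounts[] = { 1200, 350, 499 };
    printf("total = %ld\n", add_invoice_lines(amounts, 3));
    return 0;
}
```

Compile with the macro defined (for example, `-DWATCH`) to see the trace, and without it to ship; every extra byte of output mattered when memory was tight.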

Admitting an error was embarrassing because it painted a target on your back. There was no place to hide.

Releasing a revised version and getting it to clients was anything but immediate. Forget about a simple “push” to the master branch. Creating an installation package and delivering it to clients could take a week.

This was before the internet. Let me tell you, young programmers: there were IT professionals before the internet—no GitHub, no open source, no AI, no email.

You might wonder how we delivered software to clients in the 20th century. Well, we prepared a master image for floppy disks. Some programs required up to five 5¼-inch or 3½-inch floppies. Before my time, people used 8-inch disks. Eventually, duplicating machines allowed us to automate copies, but before that, it was all manual. We prepared installation instructions with envelopes and labels for each client. In our case, we had around 1,000 clients per application.

We coordinated with the postal service for deliveries. If you really messed up, you’d pay for urgent delivery, which could take two days across the country.

Now, imagine this: you make a mistake in your algorithm, and by the time it’s detected—perhaps by a client in the Canary Islands—you’ve already spent two days duplicating 3,000 disks, paid for all the shipping (in pesetas), and have no quick way to fix it.

Every bug and hotfix burned itself into your memory, especially if it was your fault the first time. Repeat mistakes, and you’d be shown the door.

I belong to the generation that survived Y2K! Enough said—everything had to work perfectly as the century turned. There was no waiting for the next century to improve.

Today, a software engineer told me about their typical tasks. They receive a PDF outlining an API. Instead of typing, they pass it to AI to parse the document and generate a DTO. If they find a similar class but aren’t sure it’s identical, they ask AI to compare and fill in any missing fields. With such tools, tasks that used to take an hour now take five minutes. But we must ask: does that acceleration come at the expense of robustness?

Back then, we had many operating systems: OASIS, THEOS, MS-DOS, XENIX, UNIX ATT, UNIX SCO, AIX… Some application developers created interpreted languages with platform-specific runtimes to minimize programming and testing costs. Frameworks were invented for this purpose: to standardize development so anyone could maintain the code. It’s not so different from today’s apps, web apps, and cross-platform frameworks, though even this might be outdated by the time I finish writing.

I’m not sure if today’s young programmers enjoy their work as much as I did.

We developed compilers, cross-platform runtimes, and code generators. We created 4GL languages and were pioneers of frameworks. We transitioned from paper-based coding to UML designs and static class diagrams. We invented design patterns and witnessed the shift from server applications to desktop applications, then to client-server setups, cloud computing, and mobile development.

In ways I can’t fully comprehend, we also taught AI how to replace you all someday soon.