Successful Programmers

Can you name some well-known developers who helped shape where computer systems are today?

Visualwebz
8 min read · Jan 2, 2024

When you hear programming or coding, what is the first thing that pops into your mind? Is it an assortment of zeros and ones on a computer screen? Or hackers scrambling to break into software systems? And what about the fascinating realm of artificial intelligence? Whatever mental images these words conjure, one fact is undeniable: programming has come a long way, continuously evolving and reshaping our technological landscape. Let’s delve into the influential contributions of five key programmers: Ada Lovelace, Tim Berners-Lee, Geoffrey Hinton, Alan Turing, and Mark Zuckerberg. Each has paved the way for the programming landscape we recognize today.

Ada Lovelace

Augusta Ada King, Countess of Lovelace, or simply Ada Lovelace, is considered one of the first programmers, if not the first, because she recognized that her friend Charles Babbage’s Analytical Engine had the potential to do more than just mathematical computations. She was born on December 10, 1815, in London, England, and died of uterine cancer on November 27, 1852. Her mother, Anne Isabella Byron, ensured Ada had a logical, scientific upbringing to counter what she saw as the negative ‘poetic’ influence of Ada’s father, Lord Byron. That upbringing is how Ada met and became friends with Babbage when she was only 17.

Charles Babbage’s Analytical Engine (Wikipedia)

There’s debate about whether to consider Ada Lovelace the first programmer. She translated a French article on the Analytical Engine by Luigi Menabrea and added her own notes and thoughts. These notes were three times the length of the original article and looked ahead to what programming could become. Her notes and her way of thinking are what make her considered one of the first programmers. That habit of reasoning past a machine’s apparent limitations is still present in programming today and makes up part of the programmer mindset.

To get into specifics, Lovelace’s Note G is considered the first algorithm explicitly written for a computer. The algorithm was designed to calculate Bernoulli numbers using the Analytical Engine. Note G is the last of her notes, labeled A through G at Babbage’s suggestion. In some of her earlier notes, she ponders the ability of Babbage’s machine to deal with things beyond numbers, such as music, language, and graphics. Connecting to the programmer’s mindset, Lovelace wrote that the machine can only do what we know how to order it to perform, and that this is limited by what we ourselves know how to do. This relates to the ‘thinking like a computer’ element of the programmer mindset that is still present today.
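Note G was written as a table of operations for the Analytical Engine’s mill and store, not in any modern language. Purely as a rough modern analogue (a sketch of the same task, not a transcription of her diagram), here is a short Python program that computes Bernoulli numbers with the standard recurrence:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the standard recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for every m >= 1."""
    B = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))            # solve the recurrence for B_m
    return B

if __name__ == "__main__":
    for i, b in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {b}")
```

Running it prints B_0 = 1, B_1 = -1/2, B_2 = 1/6, and so on, the same family of numbers Lovelace’s table of operations was designed to produce on Babbage’s machine.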

Tim Berners-Lee

Have you ever needed a recipe for your favorite dinner online? Your favorite search engines, such as Google or Bing, and browsers, such as Edge, may work together to display the recipe. They all have one thing in common: they pull information from the World Wide Web.

Picture from an article capturing the invention of the World Wide Web by Tim Berners-Lee

Tim Berners-Lee is best known for inventing the World Wide Web. His 1989 proposal laid the groundwork for the web as an information management system, and by 1993 it was available to the public. Born on June 8, 1955, Berners-Lee took to computing naturally from a young age and graduated from Oxford University. After graduation, he took on various roles as a software engineer and worked at CERN. There, he developed a hypertext system that allowed users to navigate easily between different documents by clicking on hyperlinks.
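To make the hypertext idea concrete, here is a small Python sketch (an illustration only, not Berners-Lee’s original code) that pulls the hyperlinks out of a toy HTML page using nothing but the standard library:

```python
from html.parser import HTMLParser

# A toy hypertext document: ordinary text with embedded hyperlinks,
# the core idea behind Berners-Lee's information management proposal.
PAGE = """
<p>The <a href="/history">history</a> of the
<a href="https://info.cern.ch">first website</a> at CERN.</p>
"""

class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

parser = LinkExtractor()
parser.feed(PAGE)
print(parser.links)   # ['/history', 'https://info.cern.ch']
```

Each href is a pointer to another document, and following those pointers from page to page is what turns isolated documents into a web.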

This system later led to the invention of the web, a way to store and share hypertext documents. The World Wide Web revolutionized communication and information access on a large scale, shaping many of the services we use online today, including messaging, video, and article-sharing platforms open to anyone. The internet has influenced how regular citizens communicate with each other, and it has also dramatically increased countries’ economic development and transformed how political campaigns are run. Today’s e-commerce, information services, and countless other parts of the physical and digital world run through what Berners-Lee created.

Over 30 years later, in 2021, Berners-Lee put roughly 10,000 lines of the web’s original source code up for sale online as an NFT (non-fungible token), comparing it to a signed poster. The picture below represents a portion of the code that went up for sale; these files included the very first web browser.

An auction where Sir Tim Berners-Lee’s code was sold featured a video scrolling through the lines of code.

Geoffrey Hinton

Imagine a world where your smartphone can not only understand your voice commands but also anticipate your needs, your camera can easily recognize faces, and your email application suggests the next words before you can even type them.

Many of these capabilities already exist, and Geoffrey Hinton, often called the godfather of artificial intelligence, is one of the masterminds behind them.

Picture from an article capturing Hinton’s speech at the Thomson Reuters Financial and Risk Summit in Toronto, Canada.

Hinton gained fame for his groundbreaking work in machine learning and neural networks, notably in deep learning, a subfield of machine learning built on many-layered neural networks. Deep learning powers much of modern Artificial Intelligence (AI), the development of computer systems that can perform tasks that typically require human intelligence, such as speech recognition, image generation, and language understanding.

Currently, AI is being developed to perform ever more advanced tasks, building on Hinton’s foundational neural network research. That research examines how ideas from the brain, including mental imagery, can be turned into neural networks for machine learning, symbol processing, and memory. One example of work Hinton supervised is AlexNet, designed in 2012 for visual object recognition. His work allows an AI system to learn from data on its own rather than relying on a human to hand-code its behavior. The impact he has had on our society and on coding today is immense: he laid the foundation for AI applications such as autonomous driving, which use models descended from AlexNet to recognize pedestrians, vehicles, and other objects on the road. The flavor of Hinton’s research can be seen in code like the sketch below.
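As a rough illustration only (a hypothetical sketch, not Hinton’s work or any production system; the frame size, layer sizes, and steering output are all assumed), here is how a tiny neural network of the kind Hinton championed could map a camera frame to a steering command in Python:

```python
import numpy as np

# Hypothetical sizes: a tiny 8x8 grayscale camera frame and one hidden layer.
# These numbers are illustrative only, not taken from any real self-driving system.
FRAME_SHAPE = (8, 8)
HIDDEN_UNITS = 16

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(HIDDEN_UNITS, FRAME_SHAPE[0] * FRAME_SHAPE[1]))
b1 = np.zeros(HIDDEN_UNITS)
w2 = rng.normal(scale=0.1, size=HIDDEN_UNITS)   # hidden -> steering weights
b2 = 0.0

def relu(x):
    """Rectified linear unit, the activation popularized alongside deep nets like AlexNet."""
    return np.maximum(0.0, x)

def steering_command(frame):
    """Forward pass: flatten the camera frame and squash the output into [-1, 1]."""
    x = frame.reshape(-1)                 # 8x8 image -> 64-element vector
    hidden = relu(W1 @ x + b1)            # hidden layer of learned features
    return float(np.tanh(w2 @ hidden + b2))

# A fake camera frame stands in for real sensor input.
fake_frame = rng.random(FRAME_SHAPE)
print(f"steering command: {steering_command(fake_frame):+.3f}")
```

In a real system the weights would be learned from driving data using backpropagation, the training method Hinton helped popularize, rather than left random as they are here.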

The original post showed example code written for a self-driven car in a car simulator; developers use code of that kind to turn the data taken from camera images into a working self-driving vehicle. Here is a visualization of what a self-driving car might recognize using its camera.

Example of a self-driving car’s recognition features.

Alan Turing

Alan Turing’s name might sound vaguely familiar, but not many people today know much about him. Turing pioneered artificial intelligence and computer design and was a brilliant mathematician, and his work led to many technological advancements. It should also be noted that he went deep into cryptanalysis, philosophy, and biology. Turing was born on June 23, 1912, and died of cyanide poisoning on June 7, 1954. His death was officially ruled a suicide by poisoning, though many historians remain skeptical of that ruling.

One of his later theories was that the human brain is akin to a digital computing machine. He proposed that the cortex at birth is an “unorganized machine” and that through “training” it becomes organized “into a universal machine or something like it.” This line of thinking ultimately led to the Turing test, a criterion for judging whether a machine can be said to think. In 2022, we saw the Turing test invoked again with the rise of ChatGPT, one of the many AI applications made public in the past few years, which sparked debate over whether Turing’s test had finally been met. Several evaluations have since argued that ChatGPT can pass versions of the Turing test: its responses can be hard to distinguish from a person’s unless the reader is specifically looking for the difference.

As an example of what ChatGPT can do for new Python programmers, I asked ChatGPT to write code that gives the user fun facts about Alan Turing. Here is what ChatGPT had to say:
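The screenshot of ChatGPT’s reply isn’t reproduced here, but a program along those lines might look like this (a minimal sketch of the idea, not ChatGPT’s exact output; the facts in the list are well-documented history):

```python
import random

# A few facts about Alan Turing, drawn from well-known history.
TURING_FACTS = [
    "Alan Turing was born on June 23, 1912, in London.",
    "He proposed the 'imitation game', now known as the Turing test, in 1950.",
    "During World War II he worked at Bletchley Park on breaking the Enigma cipher.",
    "The Turing machine, his 1936 abstract model, underpins modern computability theory.",
    "The ACM's most prestigious prize, the Turing Award, is named after him.",
]

def fun_fact():
    """Return a randomly chosen fun fact about Alan Turing."""
    return random.choice(TURING_FACTS)

if __name__ == "__main__":
    print("Fun fact:", fun_fact())
```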

Alan Turing: Crash Course Computer Science #15 — CrashCourse

Mark Zuckerberg

Chip Somodevilla / Getty Images

Most people these days have heard the name Mark Zuckerberg or are, at the very least, familiar with his app, Facebook. Zuckerberg was born on May 14, 1984, in White Plains, New York. He began to code at the age of 10, learning through a mixture of self-teaching and instruction from his father. When asked why he wanted to learn how to code, Mark said,

“It started because I wanted to do one simple thing: make something that was fun for myself and my sisters.”

He famously dropped out of Harvard in 2004 to devote his time to Facebook. But he was not the only one who started it: Facebook was co-founded with Eduardo Saverin, Andrew McCollum, Dustin Moskovitz, and Chris Hughes.

But before Facebook was known as Facebook, Mark and his team set out only to create a program that connected Harvard students in a social graph emphasizing real connections. At the time, using your real identity online was only just becoming normal, and Facebook was instrumental in shaping the online space as we know it today.
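A social graph is, at its core, just users plus the connections between them. As a minimal Python sketch (the names and the friend-of-friend suggestion are purely illustrative, not Facebook’s actual data model or code), it might look like this:

```python
# A toy social graph: one adjacency set per user. Names are made up for illustration.
friendships = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice"},
    "dave": {"bob"},
}

def suggest_friends(user):
    """Suggest people who are friends of the user's friends but not yet direct friends."""
    direct = friendships.get(user, set())
    suggestions = set()
    for friend in direct:
        suggestions |= friendships.get(friend, set())
    return suggestions - direct - {user}

print(suggest_friends("carol"))  # {'bob'}: Carol knows Alice, and Alice knows Bob.
```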

Nowadays, Facebook is one of the foremost social media apps, and its CEO’s net worth is estimated at over 113.36 billion dollars. Facebook’s parent company has since been renamed Meta Platforms and has shifted toward virtual and augmented reality development through its Meta Quest series. These headsets generate 3D images and video that recreate scale, distance, and depth, adding an authenticity that flat 2D images cannot.

The Incredible Evolution Of Virtual Reality — TheGamer

Takeaway

Zuckerberg’s Meta Quest series is just one of many innovations programming can lead to, not to mention the contributions of programmers such as Ada Lovelace, Tim Berners-Lee, Geoffrey Hinton, and Alan Turing. These visionaries weren’t mere programmers but people who saw beyond the limitations of their times: Lovelace’s clever thinking with the Analytical Engine, Berners-Lee’s vision of data management and connectivity through the web, Hinton’s research and design of machine learning algorithms, and Turing’s foundational principles for artificial intelligence. Together, their work has left a lasting mark on the essence of programming.

Society today stands at the crossroads of programming’s present and future, which makes it crucial to acknowledge these programmers’ impact on our digital landscape. Reflecting on the programmers of the past and how they thought can help us become better programmers today. As we honor their achievements, we must also embrace the exciting opportunities that lie ahead, thanks to the ongoing narrative they initiated.

Written by Visualwebz

A Seattle web design and online marketing agency that delivers high-end websites. A passion for web development and SEO.
