W1021 Computer History

Within these castle walls be forged Mavens of Computer Science ...
— Merlin, The Coder

Introduction

Computers have a long and rich history, with many variations and developments over time. Understanding this history is valuable for developers, as computational devices continue to grow and evolve. This article provides a cursory view of the development of modern computers as well as a glimpse into the future of computation.

Origins

Many analog devices were created early in human history out of a need to keep track of growing numbers of people and property. The abacus is an example of such a device: a computational aid for its human operator (the word "computer" originally referred to a person who performed calculations). A simple abacus has four rods with nine beads on each, and the rods represent the 1s, 10s, 100s, and 1000s places. This allowed one to easily perform basic arithmetic and keep records for growing early civilizations.
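As a rough illustration of the place-value idea an abacus encodes, here is a short Swift sketch (the bead counts are hypothetical, not taken from any particular abacus) that reads one digit per rod and combines them into a single number:

```swift
// A small sketch of the place-value idea behind an abacus: each rod holds one
// decimal digit, and the whole number is read across the rods.
let rods = [3, 4, 7, 1]   // hypothetical bead counts on the 1000s, 100s, 10s, and 1s rods

var value = 0
for digit in rods {
    value = value * 10 + digit   // shift left one decimal place, then add the next rod
}
print(value)   // prints 3471
```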

As human civilization expanded, there were several technological advancements to aid exploration and computation. Take, for example, the astrolabe, an early computational device used in sea navigation through basic astronomical computation. Many of these early machines were created to perform increasingly complex calculations, such as tracking time and position.

Military applications were an especially common driver of calculation, given the dozen or so variables needed to aim artillery accurately. Hand-operated calculating machines were used to pre-compute tables of values covering those variables, and the tables were then used in the field for a specific application. The problems with this approach were the room for error in pre-computed tables and the specificity they required: a new cannon required a different set of pre-computed values.

This led to the conceptual inventions of Charles Babbage, often deemed the father of computation. He designed two machines: the Difference Engine and the Analytical Engine. The Difference Engine was meant to compute mathematical tables of polynomial functions using the method of finite differences, which reduces the work to repeated addition. Building on this, Babbage struck upon the idea for the Analytical Engine, a general-purpose computational device that would take input from punch cards and provide output through a printer, bells, and curve plotters. Neither machine was completed during Babbage's lifetime, but both provided a basis for computation in the future.
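The Difference Engine's trick, tabulating a polynomial using only repeated addition of its finite differences, can be sketched in a few lines of Swift. The polynomial and step size below are chosen purely for illustration, not taken from Babbage's designs:

```swift
// A minimal sketch of the method of finite differences, the technique the
// Difference Engine mechanized. The polynomial f(x) = x^2 + 3x + 2 and the
// step size of 1 are illustrative.
func polynomial(_ x: Double) -> Double {
    x * x + 3 * x + 2
}

// Seed the starting value and its finite differences at x = 0.
var value = polynomial(0)                                                // f(0)
var firstDifference = polynomial(1) - polynomial(0)                      // first difference at 0
let secondDifference = polynomial(2) - 2 * polynomial(1) + polynomial(0) // second difference, constant for a quadratic

for x in 0...5 {
    print("f(\(x)) = \(value)")
    // Each new table entry needs only additions, no multiplication,
    // which is what made the engine mechanically practical.
    value += firstDifference
    firstDifference += secondDifference
}
```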

The Modern Concept

In the late 19th century, America had a significant problem: the booming population made computing census data next to impossible. It would take so long, in fact, that by the time the data was tabulated, it would already be outdated. German-American statistician and inventor Herman Hollerith solved this with the invention of an electro-mechanical tabulating machine, the punched-card tabulator, which could keep track of the massively growing population using punch cards. The invention drew on earlier work that used the punch card as the basis for input, such as Babbage's Analytical Engine.

The US Census Bureau saved millions of dollars using these machines and cut the time needed to process the 1890 Census from decades to years. Hollerith would use his invention to help form the Computing-Tabulating-Recording Company (CTR), which was later renamed the International Business Machines Corporation, famously known as IBM.

From here, computing would evolve greatly, perhaps most notably through the work of the famous mathematician Alan Turing. Turing is known for the concept of Turing completeness as well as the Turing machine, a mathematical model of an abstract machine that operates by manipulating symbols on a strip of tape (the BrainF*ck programming language is very similar in spirit to the original Turing machine). Turing devised the Turing machine during his work on the Entscheidungsproblem, showing that such a machine could carry out any mathematical algorithm it was given.

This work formed the theoretical basis of programming languages and of the question of whether a language is Turing complete, essentially meaning whether it can compute everything a Turing machine can.
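To make the tape-and-symbols idea concrete, here is a minimal Turing-machine-style interpreter in Swift. The state names and transition rules are invented for illustration (this is not Turing's original formulation); this particular machine adds one to a binary number written on the tape:

```swift
// A tiny tape machine: read the symbol under the head, look up a rule for the
// current state, write a symbol, move the head, and switch state.
struct Rule {
    let write: Character
    let move: Int          // -1 = left, +1 = right
    let nextState: String
}

// Hypothetical transition table: (state, symbol read) -> rule.
let rules: [String: [Character: Rule]] = [
    "seekEnd": ["0": Rule(write: "0", move: 1, nextState: "seekEnd"),
                "1": Rule(write: "1", move: 1, nextState: "seekEnd"),
                "_": Rule(write: "_", move: -1, nextState: "addOne")],
    "addOne":  ["0": Rule(write: "1", move: -1, nextState: "halt"),
                "1": Rule(write: "0", move: -1, nextState: "addOne"),
                "_": Rule(write: "1", move: -1, nextState: "halt")]
]

var tape: [Character] = Array("1011_")   // binary 11 followed by a blank cell
var head = 0
var state = "seekEnd"

while state != "halt" && head >= 0 && head < tape.count {
    let symbol = tape[head]
    guard let rule = rules[state]?[symbol] else { break }
    tape[head] = rule.write
    head += rule.move
    state = rule.nextState
}
print(String(tape))   // prints "1100_", binary 12
```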

The first fully programmable computer is said to have been created by the German engineer Konrad Zuse. The major concept of his machine, called the Z3, was its use of a binary system rather than the decimal system that had been used since the time of the abacus. This choice came from the fact that an electrical wire is best represented as on or off, true or false, 1 or 0. Several wires could be combined into logic gates, and from those gates a machine could represent control flow and memory and be truly programmable.
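As a small illustration of how a couple of binary signals and gates can already do arithmetic, the Swift sketch below builds a half adder, which adds two one-bit values, from an XOR and an AND. This is an illustrative example, not the Z3's actual circuitry:

```swift
// Two wires, each on (true) or off (false), combined by simple gates.
func halfAdder(_ a: Bool, _ b: Bool) -> (sum: Bool, carry: Bool) {
    let sum = a != b     // XOR: exactly one input is on
    let carry = a && b   // AND: both inputs are on
    return (sum, carry)
}

let (sum, carry) = halfAdder(true, true)                 // 1 + 1
print("sum: \(sum ? 1 : 0), carry: \(carry ? 1 : 0)")    // sum: 0, carry: 1, i.e. binary 10
```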

A look into the future

Computers have come a long way from human-driven calculation, and even from the Z3. Semiconductors and microchip technology allow computational speeds that could only have been dreamt of in the past. However, it seems the next evolution of computers will come in the form of quantum computing.

This concept can get very complex very quickly, but in short, quantum computers use laws of quantum mechanics such as superposition and entanglement to perform computations that traditional computers cannot. This sector is rapidly emerging and is certainly one for developers everywhere to watch in the years to come.

Overall, the major trends in computing history arise from the need to keep track of people and property, and from the need to quickly solve progressively more involved, in-depth, and challenging computational problems. From this, it is easy to understand how the applications of computers have grown and evolved over time, and how computer science is so entwined with the fields of electrical engineering, mathematics, and now quantum mechanics.

Key Concepts
