Charles Babbage and Alan Turing:...
Charles River Editors
Today, the world is in the midst of the transformative and ever-developing Digital Age, also known as the “Age of Information.” It has been an unprecedented, remarkable, and explosive era marked by social media and computer-generated imagery (and with it, deepfakes), among other novel, previously unimaginable concepts. The bulky monitors and blocky towers of personal computers and laptops, once considered fashionable, futuristic contraptions, have since been replaced with a sleek and stylish array – both multi-functional and specialized – of minimalistic devices, ranging from smartphones and tablets to lightweight laptops and full-fledged gaming set-ups packed with powerhouse processors.
While many are familiar with those facts, and a recent movie revived interest in Alan Turing’s achievements in computing during World War II, it was Charles Babbage who first conceived the notion of a programmable and automatic universal computer, one that, on top of calculating any mathematical equation at unmatched speed, could also be used for a seemingly infinite number of other applications. In other words, he envisioned the precursor to the modern computer.
As for Turing, given that he envisioned a concept so momentous that it ultimately led to the creation of what is now considered the world’s first computer, many might be forgiven for thinking he was the kind of suave, pipe-puffing dandy one would associate with such a grand and futuristic idea. In reality, he was nothing of the sort. Nor was Turing the kind of two-dimensional, stereotypically bookish character whose light bulb suddenly switched on during an experiment binge. On the contrary, Alan was a gauche and grief-stricken 17-year-old schoolboy who channeled all the pain and confusion of a poignant heartbreak into his tireless research, paving the way for the deeply transformative Computer Age.