The History of Computer Science


Computer science is a field growing at an astonishing rate, with breakthroughs occurring every day all across the world. For the average person, the advances brought about by computer science have given us computers, cell phones, and smart home devices. These devices have developed rapidly as well; a modern cell phone is far more powerful than the computers used to put men on the moon in the late 1960s and early 1970s. As the field of computer science continues to grow, it continues to influence other industries such as construction, agriculture, education, and the military.

It’s crazy to imagine that 70 years ago, a computer could fill an entire room. Most technological development occurred after WWII, but the period before it helped to define computer science for generations to come.

Long before the 1900s, humans had been using mechanical devices to aid in simple mathematical calculations, among them the abacus, which some historians believe originated in China. Over time, inventors and mathematicians all over the world constructed mechanical machines out of wood and gears to help simplify calculation. The most notable development of this era came from Ada Lovelace, who in 1842 wrote what is widely considered the first computer program. It existed only on paper, but it showed what was possible, and Ada was dubbed the world’s first computer programmer.

WWII brought a need for electronic digital computers to break German ciphers and to perform the calculations needed for ballistics. This period saw developments such as the Harvard Mark I, a general-purpose electromechanical computer; Colossus, a British computer used to help break the German Lorenz cipher (Enigma itself was attacked by the electromechanical Bombe machines); ENIAC, a general-purpose electronic computer; and the transistor, which revolutionized the way computers were built and eventually made the microprocessor revolution possible.

From the 1950s onward, computer science advanced rapidly. In 1957, a team led by John Backus at IBM released Fortran, one of the first high-level programming languages, and one still used today. Fortran helped programmers interact with computers far more efficiently: code written in the language was translated into machine instructions and executed, rather than programmers having to write binary by hand, punch it onto cards, and feed it to the computer. Programming saw another advance around 1970, when Niklaus Wirth created the programming language Pascal, ushering in an era of structured, more sophisticated code. During this period the demand for programmers grew exponentially, meaning more opportunities to innovate. In the early 1980s, Microsoft released MS-DOS, an operating system that let users interact with their personal computers; around this time, the processing power of computers slowly became more accessible to the average person, and programs for games, business, and the military began to be developed for PCs, or personal computers.

Nowadays, everyone carries enormous processing power in their own hands, and as computer science grows, so will its impact all across the world. We’ve only just begun to understand the potential of machine learning and artificial intelligence, and there is so much more to unlock.