Tim Buchalka

History of the Computer Part 2 - Learn To Code Series

In the last blog in this series, I posed a question: do you know which computer generation we're currently in? You'll find out the answer later in this post.




Understanding the history of the computer is very useful when learning to program, and we started doing just that in the previous blog. You can click here if you haven't read that first post. And don't worry, we'll get to the actual coding soon.


In this blog, though, we're going to discuss the most commonly cited advancements in computers from the 1950s through to today, and a little into the foreseeable future as well. Starting in the 1950s, computer scientists began assigning a generation number to the technologies used in a computer, and that numbering continues to this day. As we go along, keep in mind that the term computer is much broader today than it was even as recently as the turn of the 21st century.


Today, all kinds of electronic smart devices can be classified as computers. We live in very interesting and exciting times indeed. All right, so are you ready? Let's go.


The 1950s kicked off the proliferation of the stored-program computer as we know it today, and with it the generational labelling of computers by computer scientists based on the hardware technologies they used. So, in answer to the question of which computer generation we're presently in: we're currently in what's known as the fifth generation, and we're slowly moving towards the sixth. Let's look briefly now at what each computer generation included and is all about.


The first generation, in the 1950s, utilized vacuum tubes for processing and storage. These computers were massive, consuming entire large rooms, and it could take days or even weeks to reconfigure them in preparation for another calculation. Magnetic tape became the storage device of this era.


The second generation, from the late 1950s to the mid-1960s, used transistors and magnetic cores rather than vacuum tubes, which reduced their physical size somewhat. Transistors support two states, for example on/off, yes/no, or open/closed, which are represented as the two binary digits, zero and one, often referred to as bits.
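To make the idea of bits a little more concrete, here's a minimal Python sketch (the variable names are just illustrative) showing a two-state value and how a group of bits can be written directly in binary.

```python
# A single bit can hold only one of two states: 0 or 1
# (think on/off, yes/no, open/closed).
switch_on = 1
switch_off = 0

# Python lets you write a number directly in binary using the 0b prefix.
one_byte = 0b01000001      # eight bits grouped together
print(one_byte)            # 65 in decimal
print(bin(one_byte))       # '0b1000001', the same value shown as bits

# n bits can represent 2 ** n distinct values.
print(2 ** 8)              # a byte (8 bits) has 256 possible values
```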


IBM invented the magnetic hard disc around 1957 because businesses needed real-time data storage for accounting work. Its first disc consisted of 50 platters with 100 writable surfaces and could store a, wait for it, whopping 3.75 megabytes of data. Wow, that was a huge amount of data in its day.
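As a quick back-of-the-envelope calculation, here's a tiny Python snippet (using the figures straight from the paragraph above) working out roughly how much data each of those 100 surfaces held.

```python
# IBM's first hard disc: about 3.75 megabytes spread across 100 writable surfaces.
total_megabytes = 3.75
writable_surfaces = 100

kilobytes_per_surface = total_megabytes * 1000 / writable_surfaces
print(f"Roughly {kilobytes_per_surface:.1f} KB per surface")  # Roughly 37.5 KB per surface
```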


Finally, the Fortran and Cobol programming languages emerged during this era.


The third generation ran from 1965 through 1975 and was identified by its use of integrated circuits, or ICs, rather than transistors and magnetic cores. These computers became known as minicomputers because their physical size shrank to about the size of a desk.


IBM introduced removable disc storage in 1970, and it could hold 100 megabytes of data. The floppy disc drive, or FDD, was invented at IBM by Alan Shugart in 1967 and used an eight-inch floppy. Shugart eventually went on to found Seagate Technology.


The first commercially available floppy discs for data storage could store less than one megabyte of data. Over the next 25 to 30 years, the floppy disc drive and disc shrank to 5.25 inches and then to a hard-cased 3.5-inch disc capable of storing about 1.44 megabytes of data.


The fourth generation emerged from 1975 to 1985 and was known as the microcomputer, or desktop computer, era. It was also the beginning of the networked computer. Steve Jobs and Stephen Wozniak co-founded Apple Computer in 1976, and IBM introduced its first desktop computer, calling it a personal computer, or PC, in 1981.


This was the generation in which I was introduced to computers. I immediately found a passion for them, and that has continued to this day.


The fifth generation has now run for over 30 years and includes massively parallel processors capable of quadrillions (that's 10 to the power of 15) of computations per second, as well as hand-held and wearable devices, artificial intelligence, virtual reality, and just about every type of computer we see or hear about today.


Many of these computers are now intentionally embedded within other devices, such as automobiles, lodging houses, lighting systems in buildings, and even home refrigerators. Data storage capacity has risen tremendously during this time, while its cost has significantly decreased. Memory sticks, or thumb drives, have virtually replaced the floppy disc. Cloud storage is quite prominent as well, thanks to the likes of Dropbox, iCloud, OneDrive, and others.


The sixth generation is slowly emerging, and it may be characterized by some very different concepts and technologies: for example, the use of gallium arsenide chips rather than silicon wafer chips, and the notion of quantum computing, which is based on quantum physics.


Quantum computers are not based on binary code, the zeros and ones that today's computers continue to use, and they are not based on transistors. Instead, they use quantum bits, and a quantum bit can have far more than two states.


A quantum bit is very different from a binary bit and can store far more data. In fact, 100 quantum bits can theoretically contain 10 to the power of 30 individual pieces of information. More importantly, however, the way that quantum bits work makes parallel processing a necessity. This means that mind-bogglingly massive calculations could conceivably be performed in a fraction of a second using only one CPU cycle.
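As a quick sanity check on that figure: n two-state bits can distinguish 2 to the power of n values, so 100 qubits span 2 to the power of 100 possible states. This minimal Python calculation confirms that this is indeed on the order of 10 to the power of 30.

```python
# The number of distinct values representable by n two-state bits is 2 ** n.
ordinary_byte_states = 2 ** 8      # 256 values in an ordinary 8-bit byte
qubit_states = 2 ** 100            # states spanned by 100 qubits

print(ordinary_byte_states)        # 256
print(qubit_states)                # 1267650600228229401496703205376
print(f"{qubit_states:.2e}")       # about 1.27e+30, i.e. on the order of 10**30
```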


Another technology being explored is DNA-based data storage. A single gramme of synthetic DNA can hold 215 petabytes of data. Wow! Oh my, that's a lot of data.


Researchers really do like to push the boundaries of any existing theory, concept, or advancement, so it's certainly not unusual to see computing evolve in the same way. As I've said before, we truly do live in exciting times.


All right, we've come to the end of this blog post, but I've got one more question for you first. Do you know what Boolean operations are? They are foundational not only for a better understanding of today's computers but also for programming in general. Find out what they are and what they're all about in the next post, and thanks for reading!
