Computer Generations
Technology has advanced at a remarkable pace, and when you consider how long humans have been around, it has all happened in a minuscule span of time. Experts divide computing into five generations: five periods during which computer science took a major leap in its technological development. The first generation began in the 1940s, and the generations run all the way up to the present day.
| Computer Generation | From Year | To Year |
|---|---|---|
| First Generation | 1940 | 1956 |
| Second Generation | 1956 | 1963 |
| Third Generation | 1964 | 1971 |
| Fourth Generation | 1971 | 2010 |
| Fifth Generation | 2010 | Present Day |
First Generation: Vacuum Tubes (1940-1956)
Everything started with vacuum tubes. These were widely used in the first computer systems for circuitry, while magnetic drums were used for memory. As you’re most likely aware, these first computers were huge, often taking up an entire room. Not only that, they were expensive to run, used a great deal of electricity, and were limited in what they could do: they certainly couldn’t multitask. These machines were essentially giant calculators.
Second Generation: Transistors (1956-1963)
The world would see transistors replace vacuum tubes in the second generation of computers. The transistor was invented at Bell Labs in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
From Binary to Assembly
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers to store their instructions in memory, which moved from magnetic drum to magnetic core technology.
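To make that leap concrete, here is a minimal sketch in C (a much later high-level language, used here purely for illustration; COBOL or FORTRAN would be the period-accurate choices). The x86-style mnemonics in the comment are an assumption for demonstration, not the instruction set of any second-generation machine.

```c
#include <stdio.h>

/* A single high-level statement such as "sum = a + b" hides the
 * register-level steps a programmer once spelled out by hand.
 * In a symbolic assembly language the same work might read
 * (illustrative x86-style mnemonics):
 *
 *     mov eax, [a]    ; load a into a register
 *     add eax, [b]    ; add b to it
 *     mov [sum], eax  ; store the result
 *
 * and in first-generation machine code it would be raw binary opcodes.
 */
int main(void) {
    int a = 2, b = 3;
    int sum = a + b;      /* one readable line replaces several instructions */
    printf("%d\n", sum);  /* prints 5 */
    return 0;
}
```

The point of the contrast: assembly gave instructions human-readable names, while high-level languages let programmers describe the computation itself and leave the instruction-level details to a compiler.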
Third Generation: Integrated Circuits (1964-1971)
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time, with a central program that monitored the memory. For the first time, computers became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation: Microprocessors (1971-2010)
In the fourth generation of computers, the invention of the microprocessor (commonly known as the CPU) shrank computers to the desktop and, later, laptop sizes that we still know and use today. In 1981, IBM brought us its first home computer, and in 1984, the first Apple Macintosh was introduced. Over time these small computers became more powerful and, before long, the Internet was developed.
This generation gave us not only monitors and keyboards but also mice and, eventually, handheld devices such as cell phones.
Fifth Generation: Artificial Intelligence (Present Day)
The biggest development to date is the introduction of artificial intelligence (AI), with features such as Apple’s Siri and Amazon’s Alexa. AI is constantly adapting and, moving forward, is expected to become more tailored to individual business needs. The hope, as this generation progresses, is that computers can begin to learn self-organization, which sounds pretty appealing if organization isn’t something that comes naturally to you!