The Incredible Evolution Of Computing And How It Has Changed Everything Around Us

The evolution of computing unfolded in five phases or generations, each one separated by a hardware or software breakthrough. It may not sound like much, but the history of computing hides big and fascinating stories. What follows is a summary of the background of information technology and computing, where you will learn how computers evolved and what computer systems were like from the past to the present day.

First generation (1940-1956)

The story of the history and evolution of the computer actually begins back in the seventeenth century.

Before what we know as the first generation of computers, history had already recorded "analytical machines", which were little more than mechanical calculators and very difficult to use.

Following the contributions of minds ahead of their time such as Blaise Pascal, Gottfried Wilhelm Leibniz, and Lady Augusta Ada King (Ada Lovelace, who wrote programs for Charles Babbage's Analytical Engine), the first computer was born.

Features of the first generation

First-generation computers could stretch more than 15 meters and weighed several tens of tons of steel, cable, and vacuum tubes (also known as thermionic valves).

For what purpose was the computer created? They were built for military purposes. At this time, well into the Second World War, there was an urgent need for technological solutions capable of speeding up ballistic calculations, improving the aim of artillery, and finding new ways of encoding and decoding encrypted messages.

Not only was this equipment huge, but it was also extremely expensive to manufacture, consumed massive amounts of electricity, and generated overwhelming heat.

To program them, operators used punched cards, and the results of the operations were printed on sheets of paper.

Iconic breakthroughs of the generation

The first to arrive was the Harvard Mark I, a super calculator designed to solve ballistic problems.

However, the first officially known computer was the ENIAC (along with its successor, the EDVAC). These computers were colossal in size and energy consumption, yet they could work up to 1,000 times faster than any other machine known to date.

The computer was identifiable as a single structure, with its main circuitry and everything else built in as one integrated unit.

Breaking point and the development of the next generation

These early milestones in the history of computing left very clear breaking points.

First, the sheer amount of space required for one of these computers made it unfeasible to house more than one of them in the same building. Not to mention that their energy consumption was almost enough to supply a small city.

Along with the energy consumption came a secondary problem: the heat given off by these structures was such that no one could spend more than an hour in the same room.

All these details demanded several more years of work to extract all the potential that specialists saw in the technology, giving way to the second generation.

Second generation (1956-1964)

The second generation brought one of the biggest leaps and is known as the era of hardware architecture evolution.

This is where computer science as we know it originated. The undisputed protagonists were transistors and magnetic core memory. These not only made computers much cheaper, but also more energy efficient and more powerful, thus inaugurating the computing market.

As large corporations set their sights on these inventions, patents became a valuable tradable commodity.

Second generation features

In the previous generation, vacuum tubes were constantly damaged by heat and were very expensive, and thousands of them were needed in a single machine.

In this context, transistors became a transformative element: they were much more resistant to heat, extremely cheap, and greatly reduced the space required.

At the end of the 1950s, the first integrated circuits began to appear, marking the step toward what would become modern computer technology.

At this point, new ways of programming applications were being developed, universities were promoting computing research, and technology was rapidly moving towards the desktop computer.

How did the computer evolve in this generation?

Perhaps one of the most important milestones of this generation came from the engineers Jack Kilby and Robert Noyce who, working separately, developed the first versions of the integrated circuit.

All of it thanks to the application of semiconductor theory: materials capable of acting as conductors under certain conditions and as insulators under others.

This made it possible to emulate the behavior of a logic gate (conducting = 1, insulating = 0).
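
To make the idea concrete, here is a minimal, purely illustrative Python sketch (not period hardware or any real circuit design) of how switching elements that either conduct (1) or insulate (0) can be composed into logic gates:

```python
# Toy model: a "transistor" either conducts (1) or insulates (0)
# depending on whether its gate is driven. Names are illustrative only.

def transistor(gate_voltage: int) -> int:
    """Return 1 (conducting) when the gate is driven, 0 (insulating) otherwise."""
    return 1 if gate_voltage else 0

def not_gate(a: int) -> int:
    # Inverter: the output is pulled low only when the transistor conducts.
    return 0 if transistor(a) else 1

def nand_gate(a: int, b: int) -> int:
    # Two transistors in series: the output drops to 0 only if both conduct.
    return 0 if (transistor(a) and transistor(b)) else 1

def and_gate(a: int, b: int) -> int:
    # AND is simply NAND followed by an inverter.
    return not_gate(nand_gate(a, b))

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  NAND={nand_gate(a, b)}  AND={and_gate(a, b)}")
```

Conceptually, chaining enormous numbers of such gates is what a processor does with its transistors.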

Integrated circuits were tiny boards (tiny by the standards of the time) containing dozens of transistors. This greatly simplified the process of building, installing, and maintaining computers.

Today, a chip the size of a thumbnail can contain several billion tiny transistors.

Breaking point and the development of the next generation

The main breaking point of this generation was the continued predominance of mechanical elements.

With the development of highly efficient electronic parts, the old solutions became less and less satisfactory. This gave way to a considerable transformation that carried over all the concepts applied in the previous generation, but toward fully digital components.

Other new technologies and emerging paradigms, such as operating systems and programming languages, also represented an important step toward the emergence of the digital age.

Third generation (1964-1971)

Following the timeline of computing history, integrated circuits evolved into much more compact versions built on silicon.

The consolidation of "Silicon Valley" and the power struggles between brands such as Intel and IBM accelerated the pace of computer evolution. Computers became faster and more efficient, and for the first time operating systems and, most importantly, high-level programming languages became widespread.

Human-machine interaction became closer, and new peripherals such as keyboards and printers were incorporated.

Features of the third generation

Integrated microcircuits began to replace all other types of components. Even core memory gradually migrated to this new technology, so commercial computers were not far from materializing.

The possibility of slightly more complex programming allowed the development of multiprocessing, multiuser systems, and computers with more than one processor.

How was the computer evolving in this generation?

At the end of this generation, UNIX was born, the foundation on which most operating systems have been built, even today.

In this period, most of the big steps were made in software and programming. The ability to multitask and process data on a large scale, combined with their smaller size, made computers useful in many contexts.

Here, computing took on a more versatile role, making its mark in both the digital and the physical world.

The first commercial computers with systems capable of managing memory graphically offered the first glimpse of the modern technology that would later set the pace for future redesigns.

Breaking point and the development of the next generation

In this generation there was no single turning point, but the practicality of this tool was reaching more and more contexts, demanding a higher level of development and new solutions that never stopped arriving.

It could be said that the constant pressure to produce versions of the personal computer for the mass market was what inspired many brands to take the next leap.

Fourth generation (1971-1981)

The next step in the computing timeline begins on November 15, 1971, with the commercial release of the Intel 4004 chip along with its family of support components.

This was the first commercially available microprocessor and the first from the now-famous company dedicated to the production of semiconductors and processors. The chip packed a large number of control components into a space measured in micrometers.

From this small device, it was possible to connect to the machine's memory, control inputs and outputs, and execute the central processing functions.

This eventually gave rise to commercial computers compact enough for the concept of the personal computer to be brought to market.

Features of the fourth generation

With slightly more advanced operating systems, home computers finally became viable.

They were much easier and cheaper to produce and, at the same time, more efficient than any alternative that had come before, with lower energy consumption and easier maintenance.

Parts were no longer permanently integrated into the main board; instead, they moved toward standardized, interchangeable components that could be swapped and upgraded. This allowed users to make adjustments on their own.

How have computers evolved to this point?

Apart from the Intel 4004 chip, other names can be mentioned, such as the IBM 5150, launched in 1981 as the original IBM PC and one of the first personal computers aimed at the home market.

A short time later the Macintosh was born (although it is usually counted as part of the fifth generation).

With such powerful and advanced equipment (for its time), the first attempts to connect computers to one another were made with tremendous success. As the network grew, what is now recognized as the internet was born.

From this point on, computer networks began their own process of evolution.

Breaking point and the development of the next generation

The main breaking point of this generation, as with the previous one, was precisely the pace of technological advancement itself.

Here we begin to see a process of obsolescence, with equipment being replaced by newer designs much more frequently than before. And, given the large number of resources needed to develop these new technologies, computer companies gradually built up intellectual monopolies.

This would remain the case until the rise of commercial software and the beginning of the history of software evolution in the next generation.

Fifth generation (Since 1981)

The fifth generation of computing is the one that is currently taking place and is considered the generation of artificial intelligence.

It all started with the development of the MS-DOS operating system, on which Windows would later be built. It offered great management capability and made it remarkably easy to incorporate new programmable content, and computers became increasingly indispensable for work and everyday life.

E-mail (which had existed since the previous generation) reached the commercial level, and network services became widespread.

Features of the fifth generation

In this generation, operating systems made huge leaps every few years. Intel microprocessors became powerful enough to support true multiprocessing workloads.

Devices kept getting smaller, and modern computers were born and distributed on a commercial scale.

The history and evolution of computer networks became intimately intertwined with that of computers themselves.

New programming languages allowed the development of entire industries built on software, providing computing solutions to all kinds of strategic areas. Little by little, the foundations for artificial intelligence were laid.

Automated robots, jointly operated machines, and software capable of learning came to be applied at an industrial level in an increasingly natural, everyday way.

How has the computer evolved in this generation?

The first smartphones and modern supercomputers are two of the most outstanding inventions of this still-unfolding generation.

The simple fact of being able to create neural networks that produce their own outputs and learn from the application of those "thoughts" is fascinating, and it has become the symbol of the development of modern computing.
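
For readers curious about what "learning" means here, below is a minimal, purely illustrative Python sketch of a single artificial neuron (a perceptron) adjusting its weights to learn the logical AND function. The numbers and names are assumptions for demonstration only; modern neural networks stack millions of such units and use far more elaborate training rules.

```python
# Minimal perceptron sketch: one neuron learns the AND truth table.
import random

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]  # desired outputs for logical AND

# Start with arbitrary weights and bias.
w1, w2, bias = random.random(), random.random(), random.random()
learning_rate = 0.1

for epoch in range(50):
    for (x1, x2), target in zip(inputs, targets):
        # The neuron "fires" (outputs 1) if the weighted sum exceeds zero.
        output = 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0
        error = target - output
        # Nudge the weights in proportion to the error: the "learning" step.
        w1 += learning_rate * error * x1
        w2 += learning_rate * error * x2
        bias += learning_rate * error

# After training, the neuron reproduces the AND function.
for (x1, x2) in inputs:
    print(x1, x2, "->", 1 if (w1 * x1 + w2 * x2 + bias) > 0 else 0)
```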

Other great advances, such as Internet search engines and, more recently, the application of Big Data, have also strongly marked this generation.

The possible breaking point of the fifth generation

Three elements could mark a definitive breaking point for this generation, which has been the longest so far.

The first, a recycled problem from past generations, is that the heat output and electrical consumption of the equipment keep increasing, making it less and less sustainable.

Second, there is the ethical dilemma presented by artificial intelligence and the constant threat to user privacy; processes such as massive behavioral analysis and Big Data applied to everyday environments and social networks make this evident.

Third, but not least, the inability to develop processes more efficient than the current ones encourages the creation of a new architecture that breaks with the binary paradigm known until now: the so-called quantum computer.

A resolution to any of these three questions could trigger the events that lead to the sixth generation, opening the way to a new stage in the timeline of the evolution of computing.

With this review of the history of computing complete, it is much easier to see how small ideas and strokes of genius gain strength over time to create complex structures capable of turning history upside down, with the evolution of informatics as the central axis. So concludes this summary of what the computer is and how it came to be.
