Sunday, July 22, 2012

A brief history of computers


It is hard to imagine the life of a modern person without the electronic computer. Today anyone can, according to their needs, assemble a complete computing center on their desk. It was not always so, of course. Humanity's path to this achievement was difficult and thorny. Many centuries ago people wanted devices that would help them solve various problems. Many of those problems were solved by carrying out a sequence of routine actions, or, as we say now, by executing an algorithm. With attempts to invent a device capable of implementing the simplest such algorithms (the addition and subtraction of numbers), it all began...



Blaise Pascal
The starting point can be taken as the beginning of the XVII century (1623), when the German scientist Wilhelm Schickard built a machine that could add and subtract numbers. Soon afterwards the famous French scientist and philosopher Blaise Pascal built his adding machine, the Pascaline, which mechanized addition and subtraction. Its main element was the gear wheel, an invention that was in itself a key event in the history of computing. It should be noted that the evolution of computing technology has been uneven and spasmodic: periods of gathering strength are replaced by breakthroughs in development, followed by a period of stabilization during which the results achieved are put to practical use while knowledge and capability accumulate for the next leap forward. After each turn, the evolution moves to a new, higher level.



Gottfried Leibniz
In 1671 the German philosopher and mathematician Gottfried Leibniz also created a calculating machine, based on a special gear design: the stepped drum, or Leibniz wheel. Unlike the adding machines that preceded it, Leibniz's machine could perform all four basic arithmetic operations. With that, this round ended, and for almost a century and a half humanity gathered strength and knowledge for the next turn in the evolution of computing. The XVIII and XIX centuries were a time when various sciences, including mathematics and astronomy, developed rapidly, and they often posed problems that required long and laborious calculations.



Charles Babbage
Another famous figure in the history of computing was the English mathematician Charles Babbage. In 1823 Babbage began work on a machine for computing polynomials which, more interestingly, was meant not only to carry out the calculations but also to output the results: to print them on a plate for typesetting. The machine was to be driven by a steam engine. Because of technical difficulties Babbage was unable to complete the project, but for the first time the idea of using an external (peripheral) device to output the results of calculations had appeared. It should be noted that another inventor, Georg Scheutz, did build the machine Babbage had conceived, in 1853 (it turned out even smaller than planned). Babbage probably loved the creative search for new ideas more than their embodiment in something tangible. In 1834 he set out the principles of another machine, which he called the Analytical Engine. Technical difficulties again prevented him from fully realizing his ideas; he was able to bring the machine only to the stage of an experiment. But an idea is the engine of technical progress, and Babbage's Analytical Engine embodied the following ideas:

Control of a production process. Babbage took from the Jacquard loom the idea of controlling a machine with a punched paper tape: the loom changed the pattern woven into the fabric depending on the combination of holes in the tape. That tape was the forerunner of such familiar storage media as punched cards and paper tape.
Programmability. The machine itself was also to be controlled by a punched paper tape: the arrangement of holes on it determined the sequence of commands and the data processed by those commands. The machine had an arithmetic unit and a memory. Its set of commands even included a conditional jump, which changes the course of a computation depending on intermediate results.
Augusta Ada King, Countess of Lovelace, who is considered the world's first programmer, took part in the development of this machine.

Charles Babbage's ideas were developed and used by others. For example, in 1890, at the turn of the XX century, the American Herman Hollerith developed a machine that worked with data tables (the first Excel?). The machine was programmed with punched cards and was used to conduct the 1890 census in the U.S. In 1896 Hollerith founded the firm that became the forerunner of IBM. With Babbage's death the evolution of computing technology paused again, until the 1930s. After that, the whole development of mankind became unthinkable without computers.

In 1938 the center of development briefly shifted from America to Germany, where Konrad Zuse built a machine that, unlike its predecessors, operated not on decimal numbers but on binary ones. This machine was still mechanical, but its undoubted merit was that it realized the idea of representing data in binary code. Continuing his work, in 1941 Zuse created an electromechanical machine whose arithmetic unit was built from relays. The machine could perform floating-point operations.
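Zuse's key idea, representing numbers in binary, is easy to illustrate. Here is a minimal sketch in Python (the function names are our own) of encoding a number as bits and decoding it back:

```python
def to_binary(n, width=8):
    """Encode a non-negative integer as a list of bits (most significant first)."""
    return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

def from_binary(bits):
    """Decode a most-significant-first list of bits back to an integer."""
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value

print(to_binary(13))                           # [0, 0, 0, 0, 1, 1, 0, 1]
print(from_binary([0, 0, 0, 0, 1, 1, 0, 1]))   # 13
```

Two symbols (hole / no hole, relay open / relay closed) are far easier to realize mechanically and electrically than ten, which is why the binary choice proved so important.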

Across the ocean, in America, work on similar electromechanical machines was also under way during this period. In 1944 Howard Aiken designed a machine called the Mark-1. Like Zuse's machine, it worked on relays, but because it was clearly designed under the influence of Babbage's work, it operated on data in decimal form.

Naturally, because of the large share of mechanical parts, these machines were doomed. A new, more technologically advanced element base had to be found, and then the invention of Lee de Forest was remembered: in 1906 he had created the three-electrode vacuum tube, the triode. Thanks to its functional properties it became the most natural replacement for the relay. In 1946, at the University of Pennsylvania in the United States, the first general-purpose electronic computer, ENIAC, was created. ENIAC contained 18 thousand vacuum tubes, weighed 30 tons, occupied an area of about 200 square meters and consumed enormous power. It still used decimal operations, and it was programmed by reconnecting plug cables and setting switches. Naturally, such "programming" gave rise to many problems, caused above all by incorrectly set switches. Another key figure in the history of computing is associated with the ENIAC project: the mathematician John von Neumann. He was the first to propose recording a program and its data in the machine's memory, so that they could be modified during execution if necessary. This key principle was later used to create the fundamentally new computer EDVAC (1951). That machine switched to binary arithmetic and used a memory based on ultrasonic mercury delay lines, which could store 1024 words of 44 bits each.
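Von Neumann's stored-program principle can be illustrated with a toy machine in which instructions and data share one memory, so a running program could even rewrite its own instructions. This is only a sketch; the instruction names and format are invented for the example:

```python
# A toy stored-program machine. Each instruction is a (opcode, operand) pair,
# and instructions and data live side by side in the same memory list.
def run(memory):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc          # the program modifies its own memory
        elif op == "HALT":
            return acc

program = [
    ("LOAD", 5),    # acc = memory[5]  (the value 2)
    ("ADD", 6),     # acc += memory[6] (the value 3)
    ("STORE", 7),   # memory[7] = acc
    ("HALT", 0),
    None,           # unused cell
    2, 3, 0,        # data stored in the same memory as the code
]
print(run(program))  # 5
```

ENIAC's "program" was a pattern of cables and switches; here it is just more data in memory, which is exactly the shift von Neumann proposed.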

John von Neumann with the EDVAC computer


After the creation of EDVAC, mankind realized what heights of science and technology the human-computer tandem could reach. The industry began to develop very rapidly and dynamically, although here too there was a periodicity associated with the need to accumulate a certain store of knowledge before the next breakthrough. Up to the mid-1980s the evolution of computing technology can be divided into generations. For completeness, here is a brief qualitative description of each generation:

The first generation of computers (1945-1954). During this period the set of typical structural elements making up a computer took shape. By then designers had formed roughly the same idea of what elements a typical computer should consist of: a central processing unit (CPU), random access memory (RAM), and input-output devices (peripherals). The CPU, in turn, should consist of an arithmetic logic unit (ALU) and a control unit (CU). Machines of this generation worked on a vacuum-tube element base, so they consumed enormous amounts of power and were not very reliable. They were used mainly to solve scientific problems. Programs for these machines could already be written not only in machine language but also in assembly language.

The second generation of computers (1955-1964). The change of generations was defined by the appearance of a new element base: instead of bulky tubes, miniature transistors were used in computers, and delay lines as memory elements were replaced by magnetic-core memory. This eventually led to smaller size and greater reliability and performance. Index registers and hardware for floating-point operations appeared in computer architecture, and instructions for calling subroutines were developed.

High-level programming languages appeared: Algol, FORTRAN, COBOL. They created the preconditions for portable software that does not depend on the type of computer. With the advent of high-level languages came compilers for them, libraries of standard subroutines, and other things now familiar to us.

An important innovation to note is the appearance of so-called input-output processors. These specialized processors made it possible to free the CPU from controlling input-output, carrying it out with the help of a dedicated device simultaneously with computation. At this stage the circle of computer users expanded dramatically and the range of tasks grew. Operating systems (OS) appeared to manage machine resources effectively.

The third generation of computers (1965-1970). The change of generations was again caused by an upgrade of the element base: instead of transistors, integrated circuits of varying degrees of integration were used in the various units of computers. A chip made it possible to place dozens of elements on a plate a few centimeters across. This, in turn, not only increased the performance of computers but also reduced their size and cost. Relatively inexpensive and compact machines appeared: minicomputers. They were widely used to control various technological production processes and in systems for collecting and processing information.

The increased power of computers made it possible to execute several programs on one computer simultaneously. Those simultaneously executed actions had to be coordinated with one another, and for this the functions of the operating system were expanded.

Alongside the active development of hardware and architectural solutions, the share of development in programming technology grew. At this time the theoretical foundations of programming methods, compilation, databases, operating systems, and so on were actively developed, and software packages were created for the most varied areas of human activity.

It was now becoming a luxury to rewrite all programs with the arrival of each new type of computer. A trend emerged toward creating families of computers, that is, machines upward compatible in software and hardware. The first of these families was the IBM System/360 series and its Soviet counterpart, the ES EVM (Unified System) series.

The fourth generation of computers (1970-1984). Another change of element base led to a change of generations. In the 1970s work proceeded actively on the creation of large and very large scale integrated circuits (LSI and VLSI), which made it possible to place tens of thousands of elements on a single chip. This brought a further significant reduction in the size and cost of computers. Working with software became more user-friendly, which increased the number of users.

In principle, with such a degree of integration it became possible to try to create a functionally complete computer on a single chip. Such attempts were made, though they were mostly met with an incredulous smile. Probably there would have been fewer smiles had it been possible to foresee that this very idea would cause the extinction of mainframes some fifteen years later.

Nevertheless, in 1971 Intel released the 4004 microprocessor (MP). And if before that the world of computers had known only three niches (supercomputers, large machines (mainframes), and minicomputers), a new one was now added: the microprocessor-based computer. In the general case, a processor is understood as the functional unit of a computer intended for the logical and arithmetic processing of information on the principle of microprogram control. By hardware implementation, processors can be divided into microprocessors (which fully integrate all processor functions on one chip) and processors with small- and medium-scale integration, which implement those functions by connecting a large number of chips.



The Intel 4004
So, the first microprocessor, the 4004, was created by Intel in 1971. It was a 4-bit parallel computing device, and its capabilities were very limited: the 4004 could perform the four basic arithmetic operations and was at first used only in pocket calculators. Later its scope expanded through use in various control systems (for example, to control traffic lights). Intel, having correctly foreseen the promise of microprocessors, continued intensive development, and one of its projects eventually led to a major success that determined the future path of computing technology.



The Intel 8080
That was the project to develop the 8-bit 8080 processor (1974). This microprocessor had a fairly developed instruction set and could divide numbers. It was used in the Altair personal computer, for which the young Bill Gates wrote one of his first interpreters of the BASIC language. Perhaps it is from this point that the fifth generation should be counted.

The fifth generation of computers (1984 to the present) can be called the microprocessor generation. Note that the fourth generation ended only in the early 1980s; that is, the "parents" in the form of large machines and their rapidly maturing and strengthening "child" coexisted relatively peacefully for nearly 10 years. For both of them this time was only a benefit: the designers of large computers gained great theoretical and practical experience, while microprocessor programmers managed to find their own, at first very narrow, market niche.



Intel 8086
In 1978 Intel completed the development of the 16-bit 8086 processor. It had fairly wide registers (16 bits) and a 20-bit address bus, through which it could address up to 1 MB of RAM.
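A 20-bit physical address cannot fit into a 16-bit register, so the 8086 formed it from two 16-bit values: a segment, shifted left by 4 bits (multiplied by 16), plus an offset. A minimal sketch of that real-mode scheme (the function name is ours; the formula is the documented 8086 one):

```python
def physical_address(segment, offset):
    """8086 real-mode addressing: physical = segment * 16 + offset.
    Two 16-bit values combine into a 20-bit address, giving 1 MB of space."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return ((segment << 4) + offset) & 0xFFFFF  # mask models the 20-bit bus

print(hex(physical_address(0xB800, 0x0000)))  # 0xb8000
print(hex(physical_address(0xFFFF, 0xFFFF)))  # 0xffef (wraps past 1 MB)
```

Note that many segment:offset pairs map to the same physical address, a quirk that real-mode programmers had to keep in mind.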

In 1982 the 80286 was created, an improved version of the 8086. It supported several operating modes: real mode, in which addresses are formed by the rules of the i8086, and protected mode, which implements hardware multitasking and virtual-memory management. The 80286 also had a wider address bus, 24 bits versus the 20 of the 8086, so it could address up to 16 MB of RAM. The first computers based on this processor appeared in 1984. In computing power they became comparable to the IBM System/370, so one may consider that the fourth generation of computers ended here.



Intel 80286
In 1985 Intel introduced its first 32-bit microprocessor, the 80386, upward hardware-compatible with all the company's previous processors. It was far more powerful than its predecessors, had a 32-bit architecture, and could directly address up to 4 GB of RAM. The 386 began to support a new mode, Virtual 8086, which not only ran programs written for the 8086 more efficiently but also allowed several such programs to work in parallel. Another important innovation, support for paged memory, made it possible to work with a virtual address space far larger than physical memory.
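Paging splits an address into a page number and an offset within the page; a page table then maps each virtual page to a physical frame, and pages that are not resident trigger a page fault. A toy sketch of the translation (the table contents are invented; the 4 KB page size matches the 386):

```python
PAGE_SIZE = 4096  # the 386 uses 4 KB pages

def translate(virtual_addr, page_table):
    """Map a virtual address to a physical one through a (toy) page table."""
    page, offset = divmod(virtual_addr, PAGE_SIZE)
    if page not in page_table:
        raise MemoryError(f"page fault: page {page} is not resident")
    return page_table[page] * PAGE_SIZE + offset

# Hypothetical page table: virtual page number -> physical frame number.
table = {0: 7, 1: 3}
print(hex(translate(0x1234, table)))  # page 1, offset 0x234 -> 0x3234
```

The operating system handles the page fault by loading the missing page from disk and updating the table, which is what lets programs use more memory than is physically installed.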


Intel 80386

The 386 was the first of these microprocessors to use parallel processing. It carried out simultaneously: access to memory and input-output devices, placing instructions in a queue for decoding, converting linear addresses to physical addresses, and paged address translation (information about the 32 most recently used pages was held in a special cache).



Intel 80486

Shortly after the 386 came the 486. Its architecture further developed the idea of parallel processing: the instruction decode and execution units were organized as a five-stage pipeline, so up to five instructions could be at different stages of execution at once. A first-level cache containing frequently used code and data was placed on the chip itself; in addition, there could be a second-level cache of up to 512 KB. It now became possible to build multiprocessor configurations, and new instructions were added to the processor's instruction set. All these innovations, along with a significant increase in clock frequency (up to 133 MHz), substantially improved the speed of programs.
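The effect of a pipeline is easy to visualize: each instruction still passes through five stages, but once the pipeline is full, a new instruction completes every clock cycle. The sketch below uses the classic textbook stage names (fetch, decode, execute, memory, write-back) rather than the 486's exact ones, and assumes no stalls:

```python
STAGES = ["IF", "ID", "EX", "MEM", "WB"]  # classic five-stage pipeline

def pipeline_diagram(n_instructions):
    """Show which stage each instruction occupies on each clock cycle."""
    total_cycles = n_instructions + len(STAGES) - 1
    rows = []
    for i in range(n_instructions):
        row = ["."] * total_cycles
        for s, name in enumerate(STAGES):
            row[i + s] = name          # instruction i enters stage s at cycle i+s
        rows.append(f"I{i + 1}: " + " ".join(f"{c:>3}" for c in row))
    return "\n".join(rows)

print(pipeline_diagram(4))
```

Four instructions finish in 8 cycles instead of 20, which is where the speedup comes from; real pipelines also have to handle stalls from data and branch hazards, which this sketch ignores.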

Since 1993 Intel has produced Pentium microprocessors. Their debut was marred by an error in the floating-point unit. The error was quickly corrected, but distrust of these microprocessors lingered for some time.


Intel Pentium
The Pentium continued the development of the ideas of parallel processing. A second pipeline was added to the decode and execution units; the two pipelines (called u and v) could together execute two instructions per clock cycle. The internal cache was doubled: 8 KB for code and 8 KB for data. The processor also became more intelligent: it gained the ability to predict branches, which greatly increased the efficiency of executing non-linear algorithms. Although the architecture remained 32-bit, the microprocessor used 128- and 256-bit internal data buses, and the external data bus was widened to 64 bits. Technologies related to multiprocessor data handling also continued to develop.

The appearance of the Pentium Pro microprocessor divided the market into two sectors: high-performance workstations and low-cost home computers. The Pentium Pro implemented the most advanced technologies; in particular, one more pipeline was added to the two available in the Pentium, so in one clock cycle the microprocessor could execute up to three instructions.



Intel Pentium Pro
Moreover, the Pentium Pro allowed the dynamic execution of instructions (Dynamic Execution). Its essence is that three decoding units, working in parallel, split instructions into smaller parts called micro-operations, which five execution units (two integer units, two floating-point units, and a memory interface) can then execute in parallel. At the output, the results are reassembled into the original form and order. The power of the Pentium Pro was complemented by an improved cache organization: like the Pentium, it has 8 KB of first-level cache and a 256 KB second-level cache. But thanks to a circuit solution (the use of a dual independent bus architecture), the second-level cache was placed in the same package as the microprocessor, which significantly increased performance. The Pentium Pro also implemented a 36-bit address bus, which made it possible to address up to 64 GB of RAM.

The family of ordinary Pentium processors did not stand still either. If in the Pentium Pro computational parallelism was achieved through architectural and circuit solutions, the new models of the Pentium took a different path: they included new instructions, to support which the programming model of the microprocessor was somewhat changed. These instructions, called MMX instructions (MultiMedia eXtension), allowed the simultaneous processing of several units of the same type of data.
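The effect of an MMX-style packed operation can be sketched in ordinary code: one "instruction" processes eight bytes at once, and results that overflow are clamped rather than wrapped (saturation arithmetic, as in MMX's packed unsigned add). The function name and the data are invented for the example:

```python
def packed_add_saturate(a, b):
    """Add two 8-lane vectors of unsigned bytes lane by lane,
    clamping each result at 255 -- the effect of an MMX packed
    unsigned saturating add."""
    assert len(a) == len(b) == 8
    return [min(x + y, 255) for x, y in zip(a, b)]

# Brightening an 8-pixel strip of a grayscale image in one "instruction":
pixels     = [10, 100, 200, 250, 0, 30, 60, 90]
brightness = [40] * 8
print(packed_add_saturate(pixels, brightness))
# [50, 140, 240, 255, 40, 70, 100, 130]
```

Saturation matters for multimedia: a pixel that overflows should stay white (255), not wrap around to near-black, which is why MMX provides it in hardware.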



Intel Pentium II
The next processor released, the Pentium II, combined the technological achievements of both directions of development of the Pentium architecture. It also had new design features; in particular, its package was made using a new manufacturing technology. The market for portable computers was not forgotten either: the processor supports several power-saving modes.



The Pentium III processor. Traditionally it supports all the achievements of its predecessors; its main (and perhaps only?) advantage is the presence of 70 new instructions, which complete the MMX group but work with floating-point numbers. A special unit was added to the processor architecture to support these instructions.













