It is difficult to imagine modern life without the electronic computer (PC). Today anyone can, according to their needs, assemble a complete desktop computing center. It was, of course, not always so: humanity's path to this achievement was long and thorny. For many centuries people wanted devices that would help them solve all kinds of problems. Many of those problems were solved by carrying out a sequence of routine actions or, as we say now, by executing an algorithm. It all started with attempts to invent a device capable of carrying out the simplest of these algorithms: the addition and subtraction of numbers...
[Image: Blaise Pascal]
[Image: Gottfried Wilhelm Leibniz]
[Image: Charles Babbage]
Control of a production process. The machine controlled the operation of a loom, changing the pattern of the woven fabric depending on the combination of holes in a special paper tape. That tape was the forerunner of such familiar storage media as punched cards and punched paper tape.

Programmability. The machine's operation was also directed by a special paper tape with holes. The order of the holes on the tape determined the commands and the data those commands processed. The machine had an arithmetic unit and memory, and its command set even included a conditional jump instruction, which changed the course of a calculation depending on intermediate results.

Countess Ada Augusta Lovelace, who is considered the world's first programmer, took part in the development of this machine.
Charles Babbage's ideas were taken up and used by others. For example, in 1890, on the eve of the 20th century, the American Herman Hollerith developed a machine that worked with tables of data (the first Excel?). The machine was programmed with punched cards, and it was used to conduct the 1890 U.S. census. In 1896 Hollerith founded the company that became the forerunner of IBM. With Babbage's death there came another pause in the evolution of computing technology, lasting until the 1930s. From then on, the entire development of mankind became unthinkable without computers.
In 1938 the center of development briefly shifted from America to Germany, where Konrad Zuse created a machine that, unlike its predecessors, operated on binary rather than decimal numbers. This machine was still mechanical, but its undoubted merit was that it realized the idea of representing data in binary code. Continuing his work, in 1941 Zuse created an electromechanical machine whose arithmetic unit was built from relays. The machine could perform floating-point operations.
Across the ocean, in America, work on building such electromechanical machines was also under way during this period. In 1944 Howard Aiken designed a machine called the Mark-1. Like Zuse's machine, it ran on relays. But because it was clearly built under the influence of Babbage's work, it operated on data in decimal form.
Naturally, because of the large share of mechanical parts, these machines were doomed. A new, more technologically advanced element base had to be found, and designers remembered the invention of Lee de Forest, who in 1906 had created the three-electrode vacuum tube known as the triode. Thanks to its functional properties it became the most natural replacement for the relay. In 1946, at the University of Pennsylvania in the United States, the first general-purpose electronic computer, ENIAC, was built. It contained 18,000 vacuum tubes, weighed 30 tons, occupied an area of about 200 square meters and consumed an enormous amount of power. ENIAC still used decimal arithmetic, and it was programmed by plugging cables into connectors and setting banks of switches. Naturally, such "programming" gave rise to a host of problems, caused above all by incorrectly set switches. The ENIAC project is also linked with another key figure in the history of computing, the mathematician John von Neumann. It was he who first proposed storing the program and its data in the computer's memory, so that they could be modified during execution if necessary. This key principle was later used in building the fundamentally new computer EDVAC (1951). That machine switched to binary arithmetic and used a memory based on ultrasonic mercury delay lines, which could store 1,024 words of 44 bits each.
[Image: John von Neumann with the EDVAC computer]
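To make the stored-program idea concrete, here is a toy von Neumann machine in C: instructions and data share a single memory array, so a program can in principle read and rewrite its own code. The two-cell instruction encoding and the opcode names are invented purely for this illustration.

```c
#include <stdio.h>

/* Toy stored-program machine: code and data live in one memory array. */
enum { HALT = 0, LOAD = 1, ADD = 2, STORE = 3 };

int main(void) {
    /* Cells 0..7 hold the program, cells 8..9 hold the data. */
    int mem[16] = {
        LOAD,  8,    /* acc = mem[8]   */
        ADD,   9,    /* acc += mem[9]  */
        STORE, 9,    /* mem[9] = acc   */
        HALT,  0,
        5, 10        /* data: mem[8] = 5, mem[9] = 10 */
    };

    int pc = 0, acc = 0;
    for (;;) {
        int op = mem[pc], arg = mem[pc + 1];   /* fetch from the same memory */
        pc += 2;
        if      (op == LOAD)  acc = mem[arg];
        else if (op == ADD)   acc += mem[arg];
        else if (op == STORE) mem[arg] = acc;
        else break;                            /* HALT */
    }
    printf("mem[9] = %d\n", mem[9]);           /* prints: mem[9] = 15 */
    return 0;
}
```

Because the program is ordinary memory, a STORE aimed at cells 0..7 would modify the program itself - exactly the flexibility von Neumann's proposal introduced.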
After the creation of EDVAC, mankind realized what heights of science and technology the human-computer tandem could reach. The industry began to develop rapidly and dynamically, although here too there was a certain periodicity, tied to the need to accumulate a store of knowledge before each new breakthrough. Up to the mid-1980s the evolution of computer technology is customarily divided into generations. For completeness, here is a brief qualitative description of each generation:
The first generation of computers (1945-1954). During this period the set of standard structural elements making up a computer took shape. By this time developers had arrived at roughly the same picture of what elements a typical computer should consist of: a central processing unit (CPU), random access memory (RAM) and input-output devices (peripherals). The CPU, in turn, should consist of an arithmetic logic unit (ALU) and a control unit (CU). Machines of this generation were built on vacuum tubes, which made them consume enormous amounts of power and left them not very reliable. They were used mainly to solve scientific problems. Programs for these machines could already be written not in machine language but in assembly language.
The second generation of computers (1955-1964). The change of generations was defined by the appearance of a new element base: the bulky tubes in computers were replaced by miniature transistors, and delay lines as memory elements gave way to magnetic-core memory. This ultimately reduced the size and increased the reliability and performance of computers. Index registers and hardware for performing floating-point operations appeared in computer architecture, and instructions for calling subroutines were introduced.
High-level programming languages appeared - Algol, FORTRAN, COBOL - creating the preconditions for portable software that does not depend on the type of computer. With the advent of high-level languages came compilers for them, libraries of standard routines and other things now familiar to us.
An important innovation worth noting is the emergence of so-called input-output processors. These specialized processors freed the CPU from managing input-output, carrying it out with dedicated hardware concurrently with computation. At this stage the circle of computer users expanded dramatically and the range of tasks grew. Operating systems (OS) appeared to manage machine resources effectively.
The third generation of computers (1965-1970). The change of generations was again driven by an upgrade of the element base: instead of transistors, the various units of a computer now used integrated circuits of varying degrees of integration. A chip made it possible to place dozens of elements on a plate a few centimeters across. This, in turn, not only increased the performance of computers but also reduced their size and cost. Relatively inexpensive and compact machines appeared - minicomputers. They were widely used to control technological production processes and in systems for collecting and processing information.
The growing power of computers made it possible to execute several programs on one machine simultaneously. Concurrently running activities had to be coordinated with one another, and the functions of the operating system were expanded for this purpose.
Alongside the active development of hardware and architectural solutions, the share of work devoted to programming technology also grew. At this time the theoretical foundations of programming, compilation, databases, operating systems and so on were being actively developed, and software packages were created for the most diverse areas of human activity.
By now it was becoming a luxury to rewrite all programs with the appearance of each new type of computer. A tendency emerged to create families of computers, that is, machines upward compatible at the software and hardware level. The first of these families was the IBM System/360 series, and our domestic analogue was the ES EVM (Unified System) series.
The fourth generation of computers (1970-1984). Another change of element base led to another change of generations. In the 1970s work proceeded actively on creating large-scale and very-large-scale integrated circuits (LSI and VLSI), which made it possible to place tens of thousands of elements on a single chip. This brought a further significant reduction in the size and cost of computers. Working with software became friendlier, which expanded the number of users.
In principle, with this degree of integration it became possible to attempt a functionally complete computer on a single chip. Such attempts were made, although they were met mostly with incredulous smiles. There would probably have been fewer of those smiles had anyone been able to foresee that this very idea would drive mainframes toward extinction some fifteen years later.
Nevertheless, in the early 1970s Intel released the 4004 microprocessor (MP). And if until then the computer world had known only three directions (supercomputers, large computers (mainframes) and minicomputers), a fourth was now added to them: the microprocessor. In general, a processor is understood to be a functional unit of a computer intended for the logical and arithmetic processing of information on the basis of microprogram control. By hardware implementation, processors divide into microprocessors (which integrate all processor functions on a single chip) and processors built from small- and medium-scale integration components. Structurally this means that a microprocessor implements all the functions of a CPU on one chip, while other types of processor implement them by interconnecting a large number of chips.
[Image: Intel 4004]
[Image: Intel 8080]
The fifth generation of computers (1984 to the present) can be called the microprocessor generation. Note that the fourth generation ended only in the early 1980s; that is, the parents, in the person of the large machines, and their rapidly maturing and strengthening "son" coexisted relatively peacefully for almost ten years. This time benefited them both: the designers of large computers accumulated vast theoretical and practical experience, while microprocessor programmers managed to find their own, albeit initially very narrow, niche in the market.
[Image: Intel 8086]
In 1982 the 80286 was created: an improved version of the 8086. It supported several operating modes: real mode, in which addresses were formed by the rules of the i8086, and protected mode, which implemented hardware multitasking and virtual memory management. The 80286 also had a wider address bus - 24 bits versus the 8086's 20 - so it could address up to 16 MB of RAM. The first computers based on this processor appeared in 1984. In computing power these machines became comparable to the IBM System/370, so it may be considered that the fourth generation of computers ended here.
[Image: Intel 80286]
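The jump from 20 to 24 address lines is easy to check: each extra line doubles the addressable space, so 2^20 bytes = 1 MB grows to 2^24 bytes = 16 MB. A few lines of C confirm the arithmetic:

```c
#include <stdio.h>

int main(void) {
    /* Addressable memory = 2^(address-bus width) bytes. */
    unsigned long bytes_8086  = 1UL << 20;   /* 20 lines ->  1,048,576 B */
    unsigned long bytes_80286 = 1UL << 24;   /* 24 lines -> 16,777,216 B */

    printf("8086 : %lu bytes = %lu MB\n", bytes_8086,  bytes_8086  >> 20);
    printf("80286: %lu bytes = %lu MB\n", bytes_80286, bytes_80286 >> 20);
    return 0;
}
```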
[Image: Intel 80386]
The 80386 was the first of these microprocessors to use parallel processing. The following were carried out simultaneously: access to memory and to input-output devices, placing of instructions in the queue for decoding, conversion of linear addresses into physical addresses, and paged address translation (translations for the 32 most frequently used pages were kept in a special cache memory).
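That "special cache memory" of page translations is what we now call a translation lookaside buffer (TLB). Below is a minimal sketch in C of the idea: a linear address splits into a page number and an offset, and a small table of recent page-to-frame translations short-circuits the page-table walk. The 32-entry size and 4 KB pages match the 386; everything else (the direct-mapped organization, the stand-in page_table_walk function) is simplified for illustration and does not mirror the 386's real two-level page tables.

```c
#include <stdio.h>
#include <stdint.h>

#define PAGE_SHIFT 12   /* 4 KB pages, as on the 386 */
#define TLB_SIZE   32   /* entry count on the 80386  */

typedef struct { uint32_t page, frame; int valid; } TlbEntry;
static TlbEntry tlb[TLB_SIZE];

/* Stand-in for a real page-table walk (hypothetical mapping). */
static uint32_t page_table_walk(uint32_t page) { return page + 0x100; }

static uint32_t translate(uint32_t linear) {
    uint32_t page   = linear >> PAGE_SHIFT;
    uint32_t offset = linear & ((1u << PAGE_SHIFT) - 1);
    TlbEntry *e = &tlb[page % TLB_SIZE];   /* direct-mapped for simplicity */

    if (!e->valid || e->page != page) {    /* TLB miss: consult page tables */
        e->page  = page;
        e->frame = page_table_walk(page);
        e->valid = 1;
    }
    return (e->frame << PAGE_SHIFT) | offset;  /* a hit skips the walk entirely */
}

int main(void) {
    uint32_t lin = 0x00003ABC;
    printf("linear 0x%08X -> physical 0x%08X\n",
           (unsigned)lin, (unsigned)translate(lin));
    return 0;
}
```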
[Image: Intel 80486]
Soon after the 386 came the 486. Its architecture developed the idea of parallel processing further. The instruction decoding and execution unit was organized as a five-stage pipeline, so that up to five instructions could be at different stages of execution at the same time. A first-level cache holding frequently used code and data was placed on the chip, and in addition a second-level cache of up to 512 KB could be attached. It became possible to build multiprocessor configurations, and new instructions were added to the processor's command set. All these innovations, together with a significant increase in clock frequency (up to 133 MHz), substantially raised the speed of programs.
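The payoff of a pipeline is easy to visualize: once it fills, one instruction completes every cycle even though each instruction still passes through five stages. The toy C loop below prints which instruction occupies which stage on each cycle; the stage names follow the commonly cited 486 pipeline (fetch, two decode stages, execute, write-back), and the simulation deliberately ignores stalls and hazards.

```c
#include <stdio.h>

int main(void) {
    /* Idealized five-stage pipeline, no stalls: instruction i enters
       stage s on cycle i + s, so stage s holds instruction (cycle - s). */
    const char *stage[5] = { "FETCH", "DEC1", "DEC2", "EXEC", "WB" };
    enum { N_INSTR = 7, N_STAGES = 5 };

    for (int cycle = 0; cycle < N_INSTR + N_STAGES - 1; cycle++) {
        printf("cycle %2d:", cycle + 1);
        for (int s = 0; s < N_STAGES; s++) {
            int i = cycle - s;                 /* instruction in stage s */
            if (i >= 0 && i < N_INSTR)
                printf("  I%d=%-5s", i + 1, stage[s]);
        }
        printf("\n");  /* from cycle 5 on, one instruction retires per cycle */
    }
    return 0;
}
```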
Intel began producing Pentium microprocessors in 1993. Their debut was marred by an error in the floating-point unit. The error was quickly corrected, but a certain distrust of these microprocessors lingered for some time.
[Image: Intel Pentium]
The appearance of the Pentium Pro microprocessor divided the market into two sectors: high-performance workstations and inexpensive home computers. The most advanced technologies were implemented in the Pentium Pro. In particular, one more pipeline was added to the two already present in the Pentium, so in a single clock cycle the microprocessor could execute up to three instructions.
[Image: Intel Pentium II]
Development of the ordinary Pentium family did not stand still either. Whereas in the Pentium Pro computational parallelism was achieved through architectural and circuit-level solutions, in creating new models of the Pentium the designers took a different path: they added new instructions, and to support them the programming model of the microprocessor was changed somewhat. These instructions, called MMX instructions (MultiMedia eXtension), made it possible to process several units of the same type of data simultaneously.
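Here, as a sketch of the idea, is a packed 16-bit addition using the MMX intrinsics that compilers later exposed for these instructions (mmintrin.h; available in GCC and Clang on x86, but not in every modern toolchain, so treat this as illustrative). A single _mm_add_pi16 call adds four 16-bit lanes in one operation, where scalar code would need four separate additions.

```c
#include <stdio.h>
#include <string.h>
#include <mmintrin.h>   /* MMX intrinsics; x86-specific, support varies */

int main(void) {
    /* Pack four 16-bit integers into each 64-bit MMX register.
       _mm_set_pi16 lists lanes from highest to lowest. */
    __m64 a = _mm_set_pi16(4, 3, 2, 1);       /* lanes: 1, 2, 3, 4     */
    __m64 b = _mm_set_pi16(40, 30, 20, 10);   /* lanes: 10, 20, 30, 40 */

    /* One PADDW instruction adds all four lanes at once. */
    __m64 sum = _mm_add_pi16(a, b);

    short out[4];
    memcpy(out, &sum, sizeof out);
    _mm_empty();   /* leave MMX state before any floating-point code runs */

    printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);  /* 11 22 33 44 */
    return 0;
}
```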
[Image: Intel Pentium III]
The Pentium III processor. As tradition demands, it supports all the achievements of its predecessors; its most important (and perhaps only?) advantage is the presence of 70 new instructions. These instructions complement the MMX group, but for floating-point numbers, and a special unit was added to the processor architecture to support them.
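This floating-point extension became known as SSE, and its intrinsics (xmmintrin.h) remain universally available on x86 compilers. The sketch below mirrors the MMX example but adds four single-precision floats in one operation; the specific values are, of course, just for illustration.

```c
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics, introduced with the Pentium III */

int main(void) {
    /* Four single-precision floats per 128-bit XMM register;
       _mm_set_ps lists lanes from highest to lowest. */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(0.4f, 0.3f, 0.2f, 0.1f);

    /* One ADDPS instruction adds all four float lanes at once. */
    __m128 sum = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, sum);   /* store the lanes to an ordinary array */

    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    /* prints: 1.1 2.2 3.3 4.4 */
    return 0;
}
```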