2 - Forrai

May 5, 2018 | Author: Anonymous | Category: History, European History, Europe (1815-1915), Industrial Revolution


2. History of Computer Science

Early history

The first computers were people. They performed repetitive calculations: to count (to find the total number of things or people in a group) and later to calculate (to find out how much something will cost, how long something will take, etc., by using numbers). Boredom would quickly set in, leading to carelessness, leading to mistakes. And even on your best days you wouldn't be producing answers very fast. Therefore, inventors searched for hundreds of years for a way to mechanize (that is, find a mechanism that can perform) this task.

The abacus was an early aid for mathematical computations. The oldest surviving abacus was used in 300 B.C. by the Babylonians. A modern abacus consists of rings that slide over rods, but there was a time when pebbles were used for counting.

The first gear-driven calculating machine actually to be built was probably the calculating clock, so named by its inventor, the German professor Wilhelm Schickard, in 1623. In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal built 50 of these gear-driven one-function calculators. They were extremely expensive and really weren't that accurate. Just a few years after Pascal, the German Gottfried Wilhelm Leibniz managed to build a four-function (addition, subtraction, multiplication, and division) calculator that he called the stepped reckoner. The stepped reckoner employed the decimal number system, but Leibniz was also the first to advocate use of the binary number system, which is fundamental to the operation of modern computers.

In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could base its weave upon a pattern automatically read from punched wooden cards, held together in a long row by rope.
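The binary system Leibniz advocated represents every number as a sum of powers of two, and it is indeed how modern hardware stores integers. As an illustrative sketch (the function name is mine, not from the original text), the standard divide-by-two conversion can be written in a few lines of Python:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary-digit string."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder gives the lowest-order bit
        n //= 2                  # shift right by one binary place
    return "".join(reversed(bits))

print(to_binary(13))  # 13 = 8 + 4 + 1, so "1101"
```

Each division by two peels off one binary digit, which is why only two symbols (0 and 1) are ever needed.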
Descendants of these punched cards have been in use ever since.

By 1822 the English mathematician Charles Babbage was proposing a steam-driven calculating machine the size of a room, which he called the Difference Engine. This machine would be able to compute tables of numbers, such as logarithm tables. The device was never finished. Babbage was not deterred, and by then was on to his next brainstorm, which he called the Analytic Engine. This device, large as a house and powered by 6 steam engines, would be more general purpose in nature because it would be programmable, thanks to the punched-card technology of Jacquard. Furthermore, Babbage realized that punched paper could be employed as a storage mechanism, holding computed numbers for future reference. Babbage called the two main parts of his Analytic Engine the "Store" and the "Mill". The Store was where numbers were held, and the Mill was where they were "woven" into new results. In a modern computer these same parts are called the memory unit and the central processing unit (CPU).

The next breakthrough occurred in America. The U.S. Constitution states that a census should be taken of all U.S. citizens every 10 years. The Census Bureau offered a prize for an inventor to help with the 1890 census, and this prize was won by Herman Hollerith. Hollerith's invention, known as the Hollerith desk, consisted of a card reader which sensed the holes in the cards, a gear-driven mechanism which could count, and a large wall of dial indicators (a car speedometer is a dial indicator) to display the results of the count. Hollerith built a company, the Tabulating Machine Company, which eventually became International Business Machines, known today as IBM.

One early success was the Harvard Mark I computer, which was built as a partnership between Harvard and IBM in 1944. This was the first programmable digital computer made in the U.S., but it was not a purely electronic computer.
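The Difference Engine tabulated polynomials (the basis of logarithm and other tables) by the method of finite differences, which needs only repeated addition and no multiplication. A minimal sketch of the idea in Python (the function name and seeding are mine, chosen for illustration):

```python
def extend_table(values, steps):
    """Extend a table f(0), f(1), ... of a polynomial, given enough seed
    values that the highest-order finite difference is constant."""
    # Build successive difference rows until a single (constant) entry remains.
    diffs = [list(values)]
    while len(diffs[-1]) > 1:
        row = diffs[-1]
        diffs.append([row[i + 1] - row[i] for i in range(len(row) - 1)])
    # Keep the trailing entry of each difference order.
    cols = [d[-1] for d in diffs]
    out = list(values)
    for _ in range(steps):
        # Each new table entry is produced by additions alone,
        # cascading from the constant difference upward.
        for i in range(len(cols) - 2, -1, -1):
            cols[i] += cols[i + 1]
        out.append(cols[0])
    return out

# f(x) = x^2: seed with f(0), f(1), f(2) so second differences are constant.
print(extend_table([0, 1, 4], 5))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because every step is a plain addition, the whole procedure could be carried out by gears and carry levers, which is what made a mechanical table-making engine feasible.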
The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50 ft rotating shaft running its length, turned by a 5 horsepower electric motor. The Mark I operated on numbers that were 23 digits wide. It could add or subtract two of these numbers in three-tenths of a second, multiply them in four seconds, and divide them in ten seconds. One of the primary programmers for the Mark I was a woman, Grace Hopper. Hopper found the first computer "bug": a dead moth that had gotten into the Mark I and whose wings were blocking the reading of the holes in the paper tape. The word "bug" had already been used to describe a defect, just as it is nowadays.


Another candidate for granddaddy of the modern computer was Colossus, built during World War II by Britain for the purpose of breaking the cryptographic codes used by Germany. In 1965 the work of the German Konrad Zuse was published. Zuse had built a sequence of general-purpose computers in Nazi Germany. The first, the Z1, was built between 1936 and 1938. Zuse's third machine, the Z3, built in 1941, was probably the first operational, general-purpose, programmable (that is, software-controlled) digital computer. The Z3 was destroyed by an Allied bombing raid; the Z1 and Z2 met the same fate, but the Z4 survived.

The title of forefather of today's all-electronic digital computers is usually awarded to ENIAC, which stood for Electronic Numerical Integrator and Calculator. ENIAC was built between 1943 and 1945 by two professors, John Mauchly and the 24-year-old J. Presper Eckert, who got funding from the war department after promising they could build a machine that would replace all the "computers". ENIAC filled a 20 by 40 foot room, weighed 30 tons, and used more than 18,000 vacuum tubes. ENIAC employed paper card readers obtained from IBM. When operating, the ENIAC generated so much waste heat that it could only be operated in a specially designed room with its own heavy-duty air conditioning system. To reprogram the ENIAC you had to rearrange the patch cords and the settings of 3,000 switches; it took days to change ENIAC's program.

Eckert and Mauchly next teamed up with the mathematician John von Neumann to design EDVAC, which pioneered the stored-program concept. (A later machine, JOHNNIAC, was named in reference to John von Neumann, who was unquestionably a genius.) By the end of the 1950s computers were no longer one-of-a-kind hand-built devices owned only by universities and government research labs. Eckert and Mauchly decided to set up their own company. Their first product was the famous UNIVAC computer, the first commercial (that is, mass-produced) computer.
In the '50s, UNIVAC (a contraction of "Universal Automatic Computer") was the household word for "computer". The first UNIVAC was sold, appropriately enough, to the Census Bureau. UNIVAC was also the first computer to employ magnetic tape.

Computer Generations

A generation refers to the state of improvement in the development of a product and the different advancements of computer technology.

The First Generation: 1946-1958 (The Vacuum Tube Years)

The first generation computers were huge, slow, expensive, and often undependable. In 1946 two Americans, Presper Eckert and John Mauchly, built the ENIAC electronic computer, which used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off so much heat that they had to be cooled by gigantic air conditioners. However, even with these huge coolers, vacuum tubes still overheated regularly. It was time for something new.

The Second Generation: 1959-1964 (The Era of the Transistor)

In 1947 three scientists, John Bardeen, William Shockley, and Walter Brattain, invented the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes. These transistors were made of solid materials, such as silicon, which can be found in beach sand and glass; therefore they were very cheap to produce. Transistors were found to conduct electricity faster and better, and they were also much smaller and gave off virtually no heat compared to vacuum tubes. Their use marked a new beginning for the computer.

The Third Generation: 1965-1970 (Integrated Circuits - Miniaturizing the Computer)

Transistors were a tremendous breakthrough in advancing the computer.
The integrated circuit, or, as it is sometimes called, the semiconductor chip, packs a huge number of transistors onto a single wafer of silicon. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably. These third-generation computers could carry out instructions in billionths of a second. The size of these machines dropped to the size of small file cabinets.


The Fourth Generation: 1971-1990 (The Microprocessor)

This generation can be characterized by the invention of the microprocessor (a single chip that could do all the processing of a full-scale computer). By putting millions of transistors onto one single chip, more calculation and faster speeds could be reached by computers. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was made to be used in calculators, not computers. It led, however, to the invention of personal computers, or microcomputers. Today we have all heard of Intel and its Pentium® processors, and now we know how it all got started. The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second. There is no end in sight for the computer movement.

The Fifth Generation: Present and Beyond (Artificial Intelligence)

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

Other milestones:
1975: First personal computer, the ALTAIR (USA)
1980: Age of microcomputers – an extremely heterogeneous market offering many different types
1981: IBM appears in the PC market with the IBM PC
By 1990: The market became homogeneous around the IBM PC; IBM became the market leader, with 90% of PCs sold being IBM PCs and only 10% Apple
2000: Spread of laptops
2008: Age of netbooks


1. picture Schickard's Calculating Clock

2. picture Pascal's Pascaline

3. picture A 6 digit model

4. picture A Pascaline opened up so you can observe the gears and cylinders which rotated to display the numerical result

5. picture Leibniz's Stepped Reckoner (have you ever heard "calculating" referred to as "reckoning"?)


6. picture Jacquard's Loom showing the threads and the punched cards

7. picture By selecting particular cards for Jacquard's loom you defined the woven pattern

8. picture A close-up of a Jacquard card


9. picture This tapestry was woven by a Jacquard loom

10. picture A small section of the type of mechanism employed in Babbage's Difference Engine

11. picture An operator working at a Hollerith Desk like the one below


12. picture Preparation of punched cards for the U.S. census

13. picture A few Hollerith desks still exist today [photo courtesy The Computer Museum]

14. picture Computer punch card

15. picture Computer punch card

16. picture The Harvard Mark I: an electro-mechanical computer


17. picture The first computer bug

18. picture Code-breaking Colossus of Great Britain

19. picture The Zuse Z1 in its residential setting

20. picture ENIAC: the "Electronic Numerical Integrator and Calculator" (note that it wasn't even given the name of computer since "computers" were people) [U.S. Army photo]


21. picture ENIAC



