Sunday, 22 December 2013

Computers and You!

[Image: abacus]

[Image: integrated circuit]

Our goal is to provide you with the best computer knowledge available, and we know exactly what that takes. Our approach is rooted in training: in fact, we hope to drill you up to the level of most computer science majors, so you can be confident that the material is accurate, tested, and proven effective at bringing novices up to speed. If you have any suggestions on how we can improve our products or services, please contact us.

Computers have not always been small; before now they were much larger. Today’s personal computers are designed for an individual user, yet they have more memory, more disk space, and faster processors than the giant mainframes of just a few years ago. This post reviews the development of personal computers in the simplest way a layperson can understand. If you have an interest in computers, I am sure you will find the story amazing; if not, I assure you, you are in the wrong place.

I will try not to dive too deeply into the number systems used by computers. In my next post I will list the components that make up personal computers, describe the programs that control them, and organize the tools you will need to maintain them.

But right now….

A Brief History of Computers

The history of computational devices is full of uncertainties. Few historians can agree on who was the “first inventor” or what was the “first machine” in any number of categories. However, certain advances were so outstanding that they have become part of the folklore of computer history.

Hold on…..

Mechanical Computers

The abacus is usually listed as the first mechanical computation device. Created 2,000 or more years ago in India or the Far East, an abacus consists of columns of beads that can slide up and down on rods that are held together in a frame. The position of the beads represents a number. Skilled users could outperform early electronic computers.
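To make the place-value idea concrete, here is a minimal Python sketch (my own illustration, not a model of any historical device) that treats each rod as a single decimal digit:

```python
# A minimal sketch, not a real abacus simulator: each rod holds one
# decimal digit, and the value of the whole frame falls out of
# ordinary place-value arithmetic -- the same idea the beads encode.

def abacus_value(rods):
    """rods lists the bead count on each rod, most significant first."""
    value = 0
    for digit in rods:
        value = value * 10 + digit
    return value

print(abacus_value([4, 0, 7]))  # 407: three rods showing 4, 0, and 7 beads
```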
 
The written number for zero appeared around 650 A.D. in India and made written calculations much easier. A Persian scholar wrote the first textbook on algebra in 830 A.D. During the 1100s, Europeans learned the written form of math used by the Arabs and wrote down multiplication tables to help merchants. Five hundred years later, John Napier, a Scotsman, carved a set of multiplication tables on ivory sticks that could slide back and forth to indicate certain results. The use of logarithms on Napier’s Bones in 1617 led to the development of the slide rule.
Today’s mature engineers can still remember using slide rules in their college days (wish I did).
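The principle that logarithms gave the slide rule can be shown in a couple of lines. This is only a sketch of the identity log(ab) = log a + log b, which is what lets two sliding logarithmic scales multiply by adding lengths; it is not a model of any particular instrument:

```python
import math

# The slide-rule principle: log(a * b) = log(a) + log(b), so
# multiplication becomes the addition of two logarithmic lengths.
def slide_rule_multiply(a, b):
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(3, 7))  # ~21.0 (a real slide rule reads ~3 digits)
```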
The Frenchman Blaise Pascal is usually given credit for the first calculating machine. In 1642, to help his father, a tax collector, with his work, Pascal invented a machine with eight metal dials that could be turned to add and subtract numbers. Leonardo da Vinci and Wilhelm Schickard, a German, designed calculating machines before Pascal, but Pascal receives the recognition because he produced fifty models of his Pascaline machine, not just a prototype or a description. In 1673, Gottfried von Leibniz, a German mathematician, improved on Pascal’s design to create a Stepped Reckoner that could do addition, subtraction, multiplication, and division. A Frenchman, Thomas de Colmar, created an Arithmometer in 1820 that was produced in large numbers for the next century. A Swedish inventor, Willgodt T. Odhner, improved on the Arithmometer, and his calculating mechanism was used by dozens of companies in the calculating machines they produced.

Punched cards first appeared in 1801, when Joseph Marie Jacquard used the holes placed in a card to control the patterns woven into cloth by power looms. In 1832, Charles Babbage was working on a Difference Engine, a machine for building mathematical tables by repeated addition (a small sketch of that idea appears just below), when he realized Jacquard’s punched cards could be used in computations. The Analytical Engine, the machine Babbage designed but never manufactured, introduced the idea of memory for storing results and the idea of printed output. His drawings described a general-purpose, fully program-controlled, automatic mechanical digital computer. Lady Ada Augusta Lovelace worked with Babbage on his machine and became the first computer programmer when she wrote out a series of instructions for his Analytical Engine.

Punched cards were used in the United States census of 1890, and a data processing machine by Herman Hollerith tabulated the census results in only two and a half years, much less than the predicted ten. Punched cards provided input, memory, and output on an unlimited scale for business calculating machines for the next 50 years. The company Hollerith founded to manufacture his card-operated data processors, which used electrical contacts to detect the pattern of holes in each card, eventually became IBM.
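For the curious, here is a minimal Python sketch of the method of finite differences that Babbage’s Difference Engine mechanized. For a polynomial, the differences eventually become constant, so every further table entry can be produced with additions alone; the starting values below are my own example, not Babbage’s:

```python
# A minimal sketch of the method of finite differences: once the
# starting value and its column of differences are set, every further
# table entry needs only additions -- no multiplication at all.

def tabulate(start_diffs, steps):
    """start_diffs: initial value plus its finite differences.
    For f(x) = x**2 starting at x = 0: [0, 1, 2] (f, delta f, delta^2 f)."""
    diffs = list(start_diffs)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # add each lower difference into the one above it
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# The squares 0, 1, 4, 9, 16, 25 produced by additions alone.
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```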
Electronic Computers
With the beginning of World War II, electronic computers took on national importance. The accurate calculation of projectile trajectories became a life-and-death concern for the military. The calculations needed to develop the atomic bomb also required more calculating power than was available before the war (no thanks to the war).
Between 1939 and 1944, Howard H. Aiken developed the Harvard Mark I, also known as the IBM Automatic Sequence Controlled Calculator (ASCC). The Mark I was made of mechanical switches, electrical relays, rotating shafts, and clutches, totaling 750,000 components and weighing 5 tons. Programming instructions were fed to the Mark I on paper tape, and data was fed in on punched cards. Grace Hopper worked at Harvard on the Mark I, II, and III, and discovered the first computer “bug” when she removed a moth that had flown into a mechanical relay, causing it to malfunction. Also during the war, Konrad Zuse was working secretly on his Z3 computer in Germany. Because so little was known about the Z3 for so long, most people describe the Mark I as the first modern (but not electronic) digital computer.
Vacuum Tubes
Dr. John Vincent Atanasoff was an associate professor at Iowa State College when he designed an electronic digital computer (EDC) that would use base two (binary) numbers. In 1939, with his assistant Clifford Berry, he built the world’s first electronic digital computer using vacuum tubes. After a lecture, Dr. John W. Mauchly asked to see Atanasoff’s computer and later used so many of Atanasoff’s ideas in the ENIAC that it took a lawsuit to declare that Atanasoff was the “first” to use vacuum tubes in an electronic digital computer.

Dr. Mauchly and J. Presper Eckert were at the University of Pennsylvania in 1942 when they built ENIAC (Electronic Numerical Integrator And Computer) to aid the United States military during World War II. ENIAC used 18,000 vacuum tubes, had 500,000 hand-soldered connections, was 1,000 times faster than the Mark I, and had to be rewired to change its program. ENIAC was used from 1946 to 1955, and because of its reliability it is commonly accepted as the first successful high-speed electronic digital computer.

Eckert and Mauchly also designed the EDVAC (Electronic Discrete Variable Automatic Computer), which contained 4,000 vacuum tubes and 10,000 crystal diodes. After their success with ENIAC, Eckert and Mauchly proposed to build a UNIVAC (Universal Automatic Computer) machine to help the Census Bureau handle all its data. After four years of delays and cost overruns, Remington Rand Inc. worked with the Eckert-Mauchly Computer Corporation to develop UNIVAC, the first commercially successful computer. It used magnetic tape to store data, a major change from IBM’s punched cards, and introduced many other features that are common today. Starting in 1951, 46 UNIVAC I computers were made for government and business, although some experts at the time thought that five computers would be enough to handle all the computational needs of the world.

With further advancement, transistors and magnetic memory replaced vacuum tubes, and between 1958 and 1961 the first integrated circuits were made.
In 1970, Fairchild introduced the first 256-bit static RAM, while Intel announced the first 1024-bit dynamic RAM. Computers that could make use of this memory were still monsters to maintain. Hand-held calculators, on the other hand, appealed to everyone from scientists to school kids. Marcian “Ted” Hoff at Intel designed a general-purpose integrated circuit that could be used in calculators as well as other devices. Using ideas from this circuit, Intel introduced the 8008 in 1972; it contained approximately 3,300 transistors and was the first microprocessor to be supported by a high-level language compiler, called PL/M. A microprocessor is the heart and brains of a personal computer.
A major breakthrough occurred in 1974 when Intel presented the 8080, the first general-purpose microprocessor: a single chip that contained an entire programmable computing device. The 8080 was an 8-bit device that contained around 4,500 transistors and could perform 200,000 operations per second. Other companies besides Intel designed and produced microprocessors in the mid-1970s, including Motorola (6800), Rockwell (6502), and Zilog (Z80). As more chips appeared and prices dropped, personal desktop computers became a possibility. That head start probably explains why Intel still dominates PC processors even today.
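To give a feel for what “8-bit” meant in practice, here is a minimal Python sketch (an illustration, not actual 8080 code): registers held values from 0 to 255, so arithmetic wrapped around modulo 256 when it overflowed.

```python
# 8-bit arithmetic: keep only the low 8 bits, as an 8-bit ALU does.
def add8(a, b):
    return (a + b) & 0xFF

print(add8(200, 100))  # 44, because 300 wraps past 255
print(0xFF)            # 255, the largest value an 8-bit register can hold
```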
Personal Computers
About a dozen machines have some claim to being the first personal computer (PC). Credit for the first popular personal computer often goes to Ed Roberts, whose company, MITS, designed a computer called the Altair 8800 and marketed it as a kit for about $400 in 1974. The Altair 8800 used Intel’s 8080 microprocessor, contained 256 bytes of RAM, and was programmed by means of a panel of toggle switches. In 1975, Bill Gates and Paul Allen founded Microsoft and wrote a BASIC interpreter for the Altair. More than 2,000 systems were sold in 1975.
Man is insatiable; research upon research kept going on, and today we have laptops and palmtops, the iPad and the whole i-family. Even at that, more work is ongoing, and I believe we will soon have 256-bit processors and data buses.
 
 
