
Tle 10 2023

A computer is an electronic device that collects information, stores it, processes it according to instructions, and returns results. Early computing devices included the abacus, Napier's Bones, and the Pascaline. The history of computers progressed from mechanical to electronic devices through several generations using different technologies.


What is a Computer?

A computer is an electronic machine that collects information, stores it, processes it according to user instructions, and then returns the result.

In other words, a computer is a programmable electronic device that performs arithmetic and logical operations automatically, following a set of instructions provided by the user.

Early Computing Devices

Before computers were invented, people used sticks, stones, and bones as counting tools. As technology advanced and human understanding improved, more capable computing devices were produced. Let us look at a few of the early computing devices used by mankind.

1. Abacus
The abacus was invented in China around 4000 years ago. It is a wooden rack holding metal rods on which beads are mounted. The operator moves the beads according to certain rules to perform arithmetic computations.

2. Napier’s Bones
John Napier devised Napier’s Bones, a manually operated calculating apparatus. For calculating, this instrument used 9 separate ivory strips (bones) marked with numerals to multiply and divide. It was also one of the first devices to calculate using the decimal point.

3. Pascaline
The Pascaline was invented in 1642 by Blaise Pascal, a French mathematician and philosopher. It is considered the first mechanical and automatic calculator: a wooden box with gears and wheels inside.

4. Stepped Reckoner or Leibniz wheel
In 1673, the German mathematician and philosopher Gottfried Wilhelm Leibniz improved on Pascal’s invention to create this apparatus. It was a digital mechanical calculator known as the stepped reckoner because it used stepped drums (Leibniz wheels) instead of simple gears.

5. Difference Engine
In the early 1820s, Charles Babbage designed the Difference Engine, a mechanical computer that could do basic computations. It was conceived as a steam-powered calculating machine for producing numerical tables, such as logarithmic tables, by the method of finite differences.
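The idea behind the Difference Engine fits in a few lines of code: for any polynomial, some order of differences is constant, so a table of values can be extended using nothing but repeated addition, which is exactly the operation the engine mechanized. A minimal sketch (the function name `extend_table` is illustrative, not from Babbage's design):

```python
def extend_table(values, steps):
    """Extend a table of polynomial values by repeated addition,
    mimicking the method of finite differences behind the
    Difference Engine."""
    # Build successive difference rows until a row is constant.
    rows = [list(values)]
    while len(set(rows[-1])) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    # Keep only the rightmost (most recent) entry of each row.
    tail = [r[-1] for r in rows]
    out = list(values)
    for _ in range(steps):
        # Propagate each constant difference upward by addition only.
        for i in range(len(tail) - 2, -1, -1):
            tail[i] += tail[i + 1]
        out.append(tail[0])
    return out

# Extending the table of squares 0, 1, 4, 9 by three entries:
print(extend_table([0, 1, 4, 9], 3))  # [0, 1, 4, 9, 16, 25, 36]
```

Note that once the differences are set up, no multiplication is needed, which is why the engine could be built from adding mechanisms alone.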

6. Analytical Engine
Charles Babbage designed another calculating machine, the Analytical Engine, in the 1830s. It was a mechanical computer that took input from punched cards, and it was designed to be general purpose: in principle it could solve any mathematical problem and store intermediate results in its own memory.

7. Tabulating machine
The American statistician Herman Hollerith invented this machine in 1890. The Tabulating Machine was a punched-card mechanical tabulator that could compute statistics and record or sort data. Hollerith began manufacturing these machines in his own company, which in 1924 ultimately became International Business Machines (IBM).

8. Differential Analyzer
Vannevar Bush introduced the Differential Analyzer in 1930. It was a mechanical analogue computer that used wheel-and-disc integrators to solve differential equations, and it could carry out calculations that once took days in a matter of minutes.

9. Mark I
In 1937, Howard Aiken planned to build a machine that could carry out massive calculations involving enormous numbers. The resulting Mark I computer was constructed in 1944 as a collaboration between IBM and Harvard.

History of Computer Generations


The word ‘computer’ has a very interesting origin. It was first used in the early 17th century for a person who computed, i.e. did calculations. The word was used in that sense, as a job title, well into the 20th century: women were hired as human computers to carry out all forms of calculations and computations.

By the last part of the 19th century, the word was also used to describe
machines that did calculations. The modern-day use of the word is
generally to describe programmable digital devices that run on
electricity.
Early History of Computers

Humans have used devices for calculation for thousands of years; one of the earliest and most well-known was the abacus. Then in 1822, Charles Babbage, the father of computers, began developing what would have been the first mechanical computer. In the 1830s he went on to design the Analytical Engine, a general-purpose computer. It contained an ALU, basic flow-control principles, and the concept of integrated memory.

Then, more than a century later in the history of computers, we got our first general-purpose electronic computer: the ENIAC, which stands for Electronic Numerical Integrator and Computer. Its inventors were John W. Mauchly and J. Presper Eckert.

As technology developed, computers became smaller and processing became faster. The first portable computers appeared in 1981, introduced by Adam Osborne (the Osborne 1) and Epson (the HX-20).


Generations of Computers
In the history of computers, we often refer to the advancements of
modern computers as the generation of computers. We are currently on
the fifth generation of computers. So let us look at the important
features of these five generations of computers.

 1st Generation: This period ran from 1940 to 1955, when machine language was developed for programming computers. These machines used vacuum tubes for circuitry and magnetic drums for memory. They were complicated, large, and expensive, and mostly relied on batch operating systems and punched cards; magnetic tape and paper tape served as input and output media. Examples: ENIAC, UNIVAC-1, EDVAC, and so on.
 2nd Generation: The years 1957-1963 are referred to as the second generation of computers. Transistors replaced vacuum tubes, making computers smaller, faster, and more energy-efficient. Programming advanced from binary to assembly languages, and early high-level languages such as COBOL and FORTRAN came into use. For instance, IBM 1620, IBM 7094, CDC 1604, CDC 3600, and so forth.
 3rd Generation: The hallmark of this period (1964-1971) was the development of the integrated circuit. A single integrated circuit (IC) contains many transistors, which increases the power of a computer while simultaneously lowering its cost. These computers were quicker, smaller, more reliable, and less expensive than their predecessors. High-level programming languages such as FORTRAN II to IV, COBOL, PASCAL, and PL/1 were utilized. For example, the IBM-360 series, the Honeywell-6000 series, and the IBM-370/168.
 4th Generation: The invention of the microprocessor brought about the fourth generation of computers, which dominated the years 1971-1980. Languages such as C, and later C++ and Java, were used in this generation. For instance, the STAR 1000, PDP 11, CRAY-1, CRAY X-MP, and Apple II. This was when computers for home use first began to be produced.
 5th Generation: These computers have been in use since 1980 and continue to be used now; this is the present and future of the computer world. The defining aspect of this generation is artificial intelligence, and the use of parallel processing and superconductors is helping to make it a reality. Fifth-generation computers use ULSI (Ultra Large Scale Integration) technology, and languages such as C, C++, Java, and .NET are used. Examples include modern desktops, laptops, notebooks, and ultrabooks.

Brief History of Computers


The naive understanding of computation had to be overcome before the
true power of computing could be realized. The inventors who worked
tirelessly to bring the computer into the world had to realize that what
they were creating was more than just a number cruncher or a
calculator. They had to address all of the difficulties associated with
inventing such a machine, implementing the design, and actually
building the thing. The history of the computer is the history of these
difficulties being solved.
19th Century

1801 – Joseph Marie Jacquard, a weaver and businessman from France, devised a loom that employed punched wooden cards to weave cloth designs automatically.

1822 – Charles Babbage, a mathematician, designed a steam-powered calculating machine capable of computing tables of numbers. The “Difference Engine” project failed owing to the limits of manufacturing technology at the time.

1843 – The world’s first computer program was written by Ada Lovelace, an English mathematician. Lovelace included a step-by-step method for computing Bernoulli numbers using Babbage’s machine.
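Lovelace's program computed Bernoulli numbers; the same quantities can be obtained today from their standard recurrence. A minimal sketch in Python (not her actual algorithm, which was tailored to the Analytical Engine's operations):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n as exact fractions, using
    the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 (so B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Solve the recurrence for B_m given B_0..B_{m-1}.
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B[m] = -acc / (m + 1)
    return B

print([str(b) for b in bernoulli(4)])  # ['1', '-1/2', '1/6', '0', '-1/30']
```

Exact fractions are used because Bernoulli numbers are rationals whose denominators grow quickly; floating point would lose precision within a few terms.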

1890 – Herman Hollerith, an inventor, created the punched-card technique used to tabulate the 1890 U.S. census. He would go on to start the corporation that would become IBM.

Early 20th Century

1930 – The Differential Analyzer, the first large-scale automatic general-purpose mechanical analogue computer, was invented and built by Vannevar Bush.

1936 – Alan Turing had an idea for a universal machine, which he called the Turing machine, that could compute anything that could be computed.
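Turing's universal machine is simple enough to capture in a few lines: a tape, a head position, a state, and a table of rules. A minimal simulator sketch (the rule format here is an illustrative choice, not Turing's own notation):

```python
def run_turing_machine(tape, rules, state="start", max_steps=10_000):
    """Simulate a one-tape Turing machine. `rules` maps
    (state, symbol) -> (symbol_to_write, move 'L' or 'R', next_state);
    the machine stops when it reaches the state 'halt'."""
    cells = dict(enumerate(tape))  # sparse tape; '_' means blank
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A machine that flips every bit, then halts on the first blank:
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("0110", flip))  # 1001_
```

The key insight is that the rule table is itself just data, so one machine can read another machine's table and imitate it; that is what makes the construction "universal".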

1939 – Hewlett-Packard was founded in a garage in Palo Alto, California by Bill Hewlett and David Packard.
1941 – Konrad Zuse, a German inventor and engineer, completed his Z3 machine, the world’s first programmable, fully automatic digital computer. However, the machine was destroyed during a World War II bombing raid on Berlin.

1941 – J.V. Atanasoff and graduate student Clifford Berry devised a computer capable of solving 29 equations simultaneously; it was the first computer able to store data in its main memory.

1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert created the Electronic Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class of numerical problems” through reprogramming, earning it the title of “Grandfather of computers.”

1946 – Design began on the UNIVAC I (Universal Automatic Computer), the first general-purpose electronic digital computer produced in the United States for business applications.

1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the University of Cambridge, is the “first practical stored-program computer.”

1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC; it was the first stored-program computer completed in the United States.

Late 20th Century

1953 – Grace Hopper, a computer scientist, was developing the first compilers; her English-like FLOW-MATIC language later became the basis for COBOL (COmmon Business-Oriented Language, 1959), which allowed a computer user to give the computer instructions in English-like words rather than numbers.
1954 – John Backus and a team of IBM programmers created the
FORTRAN programming language, an acronym
for FORmula TRANslation. In addition, IBM developed the 650.

1958 – The integrated circuit, sometimes known as the computer chip, was created by Jack Kilby and, independently, Robert Noyce.

1962 – Atlas, the computer, makes its appearance. It was the fastest
computer in the world at the time, and it pioneered the concept of
“virtual memory.”

1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a graphical user interface (GUI).

1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, revealed UNIX, an operating system (later rewritten in the C programming language) that addressed program-portability difficulties.

1970 – The Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip, is unveiled by Intel.

1971 – The floppy disc was invented by Alan Shugart and a team of IBM engineers. In the same year, Xerox developed the first laser printer, which would go on to generate billions of dollars and herald a new age in computer printing.

1973 – Robert Metcalfe, a member of Xerox’s research staff, created Ethernet, which is used to connect multiple computers and other hardware.
1974 – Personal computers arrived on the market. Among the first were the Scelbi, the Mark-8, the IBM 5100, and Radio Shack’s TRS-80.

1975 – In January, Popular Electronics magazine touted the Altair 8800 as the world’s first minicomputer kit. Paul Allen and Bill Gates offered to write software for the Altair in the BASIC language.

1976 – Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to the Apple I, the first computer sold as a single circuit board.

1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It offered colour graphics and a cassette drive for storing data.
