1. Computer Literacy. Computer literacy can be defined as a working knowledge of personal
computers and commonly used computer software. Literacy refers to varying degrees of
knowledge in this context, as it does when used to indicate that an individual can read.
Understanding Computers. To help you better understand computers, we will begin our journey back in time, to when computers were first used. Then we will move quickly forward to the present, with automation and the advantages that computers have brought to business.
2. History of Computers. It may surprise you to know that electronic computer systems have
been around for about 50 years. The first electronic digital computer was designed in
the winter of 1937-38. This machine, the Atanasoff-Berry Computer, or ABC, provided the
foundation for the next advances in electronic digital computers. The first computer system put
into use was ENIAC (Electronic Numerical Integrator and Computer) in 1946. This computer
was the first large-scale electronic digital computer. The ENIAC weighed 30 tons, contained
18,000 vacuum tubes, and occupied a 30 by 50-foot space.
In 1951-52, after much discussion, IBM made the decision to add computers to their line of
business equipment products. This decision led IBM to become a dominant force in the computer
industry. Public awareness of computers increased when the UNIVAC I correctly
predicted that Dwight D. Eisenhower would win the presidential election after analyzing only
five percent of the tallied vote. Also in 1952, Dr. Grace Hopper, a mathematician and
U.S. Navy officer who later attained the rank of rear admiral, wrote a paper describing how to
prepare a computer program with symbolic notation instead of the detailed machine language that
had been used.
FORTRAN (FORmula TRANslation) was introduced in 1957. This programming language proved
that efficient, easy-to-use computer languages could be developed. FORTRAN is still in use.
In 1958, computers built with transistors marked the beginning of the second generation of
computer hardware. Previous computers built with vacuum tubes are called first-generation
machines. By 1959, over 200 programming languages had been created.
In 1960, COBOL (COmmon Business Oriented Language), a business application language, was
introduced. COBOL, which uses English-like phrases, can be run on most brands of computers,
making it one of the most widely used languages in the world. From 1958 to 1964, it is estimated
that the number of computers in the U.S. grew from 2,500 to 18,000.
The third-generation computers were introduced in 1964. Their controlling circuitry is stored on
chips. The IBM System/360 family of computers was the first of the third-generation machines.
Digital Equipment Corporation introduced the first minicomputer in 1965, the same period that
saw the development of the BASIC (Beginner's All-purpose Symbolic Instruction Code)
programming language.
The 1960s saw the birth of the software industry. In 1968, Computer Sciences Corporation
became the first software company listed on a national stock exchange. In 1971, Intel released
the first commercially available microprocessor, the Intel 4004.