Module CC 101 Introduction To Computing
VISION
A center of human development committed to the pursuit of wisdom, truth, justice, pride, dignity, and local/global competitiveness via a quality but affordable education for all qualified clients.

MISSION
Establish and maintain an academic environment promoting the pursuit of excellence and the total development of its students as human beings, with fear of God and love of country and fellowmen.
GOALS
Kolehiyo ng Lungsod ng Lipa aims to:
foster the spiritual, intellectual, social, moral, and creative life of its clients via affordable but quality tertiary education;
provide the clients with a rich, substantial, and relevant range of academic disciplines, and expose them to varied curricular and co-curricular experiences which nurture and enhance their personal dedication and commitment to social, moral, cultural, and economic transformation.
Learning Module
in
CC 101
Introduction to Computing
Name: _________________________________________________________
(Last name, First Name MI.)
IV. ENGAGEMENT:
Before we begin discussing the different industries that cater to your course (BSCS or ACT), let us first discuss some important things about computing.
WHAT IS COMPUTING?
The history of computing is longer than the history of computing hardware and modern
computing technology and includes the history of methods intended for pen and paper or for
chalk and slate, with or without the aid of tables.
Computing is intimately tied to the representation of numbers. But long before abstractions
like the number arose, there were mathematical concepts to serve the purposes of civilization.
These concepts include one-to-one correspondence (the basis of counting), comparison to a
standard (used for measurement), and the 3-4-5 right triangle (a device for assuring a right angle).
The earliest known tool for use in computation was the abacus, and it was
thought to have been invented in Babylon circa 2400 BC. Its original style of
usage was by lines drawn in sand with pebbles. Abaci, of a more modern
design, are still used as calculation tools today. This was the first known
calculation aid - preceding Greek methods by 2,000 years.
The first recorded idea of using digital electronics for computing was the 1931 paper "The
Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-
Williams. Claude Shannon's 1938 paper "A Symbolic Analysis of Relay and Switching Circuits"
then introduced the idea of using electronics for Boolean algebraic operations.
SOFTWARE INDUSTRY
V. ACTIVITY
Do you own something that you would consider as a computer? If yes, what is it?
Have you done something that you would consider as "computing" based on the definition stated above?
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
VII. EVALUATION
Instruction: Multiple Choice. Choose the letter of the word that best matches the definition stated
on each number. Write your answer on the right column blank. 1 point each.
2. It was founded in 1947 and is the world's largest scientific and educational
   computing society.                                                    2. _______
   a. IEEE          b. ACM          c. INTEL          d. MICROSOFT

4. A complete computer.                                                  4. _______
   a. Computer Hardware       b. Computer Software
   c. Operating System        d. Computer System
Rubrics:
Score   Criteria      Grade Equivalent
10      OUTSTANDING   100%
9       OUTSTANDING   96%
8       EXCELLENT     92%
7       EXCELLENT     88%
6       VERY GOOD     84%
5       VERY GOOD     80%
REFERENCES:
https://en.wikipedia.org/wiki/Computing
IV. ENGAGEMENT:
SUB-DISCIPLINES OF COMPUTING
COMPUTER ENGINEERING
SOFTWARE ENGINEERING
Software engineering (SE) is the application of a systematic, disciplined, quantifiable
approach to the design, development, operation, and maintenance of software, and the study of
these approaches; that is, the application of engineering to software. In layman's terms, it is the
act of using insights to conceive, model and scale a solution to a problem. The first reference to
the term is the 1968 NATO Software Engineering Conference and was meant to provoke thought
regarding the perceived "software crisis" at the time. Software development, a much used and
more generic term, does not necessarily subsume the engineering paradigm. The generally
accepted concepts of Software Engineering as an engineering discipline have been specified in
the Guide to the Software Engineering Body of Knowledge (SWEBOK). The SWEBOK has
become an internationally accepted standard ISO/IEC TR 19759:2015.
COMPUTER SCIENCE
Computer science or computing science (abbreviated CS or Comp Sci) is the scientific
and practical approach to computation and its applications. A computer scientist specializes in
the theory of computation and the design of computational systems.
Its subfields can be divided into practical techniques for its implementation and application in
computer systems and purely theoretical areas. Some, such as computational complexity theory,
which studies fundamental properties of computational problems, are highly abstract, while
others, such as computer graphics, emphasize real-world applications. Still others focus on the
challenges in implementing computations. For example, programming language theory studies
approaches to description of computations, while the study of computer programming itself
investigates various aspects of the use of programming languages and complex systems, and
human–computer interaction focuses on the challenges in making computers and computations
useful, usable, and universally accessible to humans.
INFORMATION SYSTEMS
INFORMATION TECHNOLOGY
Information technology (IT) is the application of computers and telecommunications
equipment to store, retrieve, transmit and manipulate data, often in the context of a business or
other enterprise. The term is commonly used as a synonym for computers and computer
networks, but it also encompasses other information distribution technologies such as television
and telephones. Several industries are associated with information technology, such as computer
hardware, software, electronics, semiconductors, internet, telecom equipment, e-commerce and
computer services.
SYSTEMS ADMINISTRATION
A system administrator, IT systems administrator, systems administrator, or sysadmin is a
person employed to maintain and operate a computer system or network. The duties of a system
administrator are wide-ranging and may vary substantially from one organization to another.
Sysadmins are usually charged with installing, supporting and maintaining servers or other
computer systems, and planning for and responding to service outages and other problems.
Other duties may include scripting or light programming, project management for systems-related
projects, supervising or training computer operators, and being the consultant for computer
problems beyond the knowledge of technical support staff.
V. ACTIVITY
Of course, the careers discussed are just options; still, you can decide what you want to be after you graduate, and that is?
VI. OUTPUT(RESULT)
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
NAME:________________________________ DATE ACCOMPLISHED:___________
COURSE / YR & SEC: ________________________
VII. EVALUATION
Instruction: IDENTIFICATION. Determine what is being asked in each of the following
statements. Write your answer for each statement on the blank in the right column. (1 point
each number).
1. A discipline that integrates several fields of electrical engineering and
   computer science required to develop computer hardware and software.     1._____________________
Rubrics:
Score   Criteria            Grade Equivalent
10      OUTSTANDING         100%
9       OUTSTANDING         96%
8       EXCELLENT           92%
7       EXCELLENT           88%
6       VERY GOOD           84%
5       VERY GOOD           80%
4       AVERAGE             76%
3       AVERAGE             72%
2       POOR                68%
1       POOR                64%
0       NEEDS COUNSELLING   60%
https://en.wikipedia.org/wiki/Computing
https://www.acm.org/education/resources-for-grads
IV. ENGAGEMENT:
https://www.youtube.com/watch?v=_U21fT8VLp0
https://www.youtube.com/watch?v=P7fi4hP_y80
https://www.youtube.com/watch?v=FdipJNG_vV8
4. Design Systems - Computer Aided Design - There exist many computer programs
used to design the model of a product on the computer system. This process is called Computer-Aided
Design or CAD. It is through Computer-Aided Design techniques that we can test the
https://www.youtube.com/watch?v=EPwOgh-M_ok
When a business can keep in touch with its clients, it becomes easier for the clients to make
inquiries of the business or to ask for more information about the services and products the
business offers. It also becomes easier for the business to offer customer support to its clients in
a timely, efficient manner. The business will also be able to keep the clients updated about any
new developments concerning the business.
Communication goes beyond a business' clients. A business also needs to communicate with its
employees, and computers play an important role. Rather than have time-wasting one-on-one
meetings with employees, managers can simply email their employees or they can message
them on any other acceptable communication platform. This saves time, and it also improves the
internal communication of the business.
V. ACTIVITY
ENUMERATION: List down what is asked in the given statement. Write your answer on the
space provided.
1. USES OF COMPUTER IN INDUSTRY
a) __________________________
b) __________________________
c) __________________________
d) __________________________
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
VII. EVALUATION
Instruction: MATCHING TYPE. Match Column A with Column B. Choose the letter of
word/statement in Column A that best matches the word/statement in Column B. Write the letter
of your choice on the right column blank. (1 point each number).
Column A                              Column B                        Answer
1. Computer-Controlled Robots A. Sports 1._______
2. Automated Production Systems B. Perform complicated jobs 2._______
3. Computer-aided manufacturing C. On-line class 3._______
4. Computer Aided Design D. Data-mining 4._______
5. Communication E. Spray painting and welding 5._______
6. Marketing F. Automated machines 6._______
7. Accounting G. Designing a model 7._______
8. Storage H. Exchanging information 8._______
9. Producing Documents I. Selling products 9._______
10. Education J. Computations 10.______
11. Research K. Saving files 11.______
Rubrics:
REFERENCES
https://smallbusiness.chron.com/uses-computers-business-56844.html
https://en.wikipedia.org/wiki/Computing
THE HARDWARE
The term computer dates back to the 1600s. However, until the 1950s, the term referred almost
exclusively to a human who performed computations. For human beings, the task of performing
large amounts of computation is one that is laborious, time consuming, and error prone. Thus,
the human desire to mechanize arithmetic is an ancient one.
One of the earliest devices developed for simplifying human arithmetic was the abacus, already in
use in ancient times.
Watch: https://www.youtube.com/watch?v=5wB80HzF8sM
Watch: https://www.youtube.com/watch?v=bGk9W65vXNA
On the other hand, binary digits – also known as "bits" – are based on powers of 2, where
every digit one moves to the left represents another power of 2: ones (2⁰), twos (2¹), fours (2²),
eights (2³), sixteens (2⁴), etc. Thus, in binary, the number eighteen would be written in Base-2
as 10010, understood arithmetically as the sum of 1 sixteen, 0 eights, 0 fours, 1 two, and 0 ones.
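To make this concrete, here is a minimal sketch in Processing (the programming language used in the later lessons of this module); binary() and unbinary() are built-in Processing functions, and the particular values are simply the examples from the text:

    // Eighteen in binary: 1 sixteen, 0 eights, 0 fours, 1 two, and 0 ones.
    println(binary(18, 5));        // prints "10010"
    println(unbinary("10010"));    // prints 18

    // "Two hundred fifty-five" needs only 8 binary digits (8 on/off devices).
    println(binary(255, 8));       // prints "11111111"
    println(unbinary("11111111")); // prints 255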
Why on earth would computer engineers choose to build a machine to do arithmetic using such a
cryptic, unfamiliar form of writing numbers as a binary, Base-Two numeral scheme? Here’s why.
In any digital numeral system, each digit must be able to count up to one less than the base.
Thus, in the case of the Base-10 system, the counting sequence of each decimal digit runs from 0 up
to 9, and then back to 0. To represent a decimal digit, then, one must be able to account for all
10 possibilities in the counting sequence, 0 through 9, so one must either use a device with ten
possible states, like the ten-position gear used in the Pascaline, or ten separate devices, like the
ten separate vacuum tubes used for each digit in the ENIAC.
However, the binary numeral system is Base-2. Thus, given that its digits also need only to be
able to count as high as one less than the base, this means that the counting sequence of each
binary digit runs from 0 only up to 1, and then back again to 0. In other words, whereas
ten different numbers can appear in a decimal digit, 0 through 9, the only number that will ever
appear in a binary digit is a 0 or a 1. Thus, rather than having to account for the 10 possibilities
of a decimal digit, one can represent a binary digit with only a single device that has two possible
states. For example, one could represent each binary digit with a simple on/off switch, where the
"on" position represents a 1 and the "off" position represents a 0.
Likewise, the number "two hundred fifty-five" could be represented with only 8 vacuum tubes,
instead of the 30 that ENIAC required.
Watch: https://www.youtube.com/watch?v=thrx3SBEpL8
https://www.youtube.com/watch?v=Xpk67YzOn5w
1. PROCESSORS
It is fairly easy to acquire a basic understanding of how a line of interlocking, 10-position
gears can mimic the operations of decimal arithmetic. But it is far less obvious how an array of
vacuum tubes or transistors, used as electronic on-off switches, mimics the operations of binary
arithmetic.
One helpful analogy is that of a set of dominoes. Imagine a domino exhibition on a late-night talk
show, where a domino champion sets up an elaborate maze of dominoes, knocks one of them
over, and sets off an elaborate chain reaction of falling dominoes, lasting several minutes.
Eventually, the sequence of falling dominoes reaches the end, and the last set of dominoes
tumble over in a grand flourish. Similarly, imagine a set of dominoes on a table where there is a
line of eight dominoes at one end, and another line of eight dominoes at the other end, with a
maze of other dominoes in between. If you were to go to the eight dominoes at one end and
knock over some or all of them, this would set off a chain reaction of falling dominoes in the maze
laid out until, eventually, this chain reaction stopped at the other end, where some or all of those
eight dominoes would also fall.
There are some similarities here to the way a processor works. A domino, like a transistor, is a
two-state device: just as a transistor can be in either an on or an off position, a domino can be
either standing up or lying down. Thus, the eight dominoes knocked over at one end can be thought
of as an 8-digit binary "input"; a chain reaction of falling dominoes through the maze would then
take place, and a resulting 8-digit binary number would be output by the eight dominoes at the other end.
Thus, even today, a modern electronic digital computer is still, at the core of its hardware,
a machine that performs basic arithmetic operations. More specifically, it is a machine that
mimics or models the way that digits change when humans do basic arithmetic. What is
remarkable about the way that today's computers model arithmetic is their extraordinary speed in
doing so. Today's microprocessors are typically 32 bits or higher, meaning that their instructions
consist of binary numbers that are 32 or more digits long. Their instruction cycles are
described in "gigahertz," meaning that such processors can perform literally billions of instruction
cycles every second.
V. ACTIVITY
DEFINITION: Define the following terms below. Write your answer on the space provided.
a) Abacus -
VI. OUTPUT(RESULT)
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
VII. EVALUATION
Rubrics:
REFERENCES
https://cs.calvin.edu/activities/books/processing/text/01computing.pdf
Babbage planned to feed into his Analytical Engine sequences of metal cards with holes
punched into them. Instead of being used to define a sequence of threads to incorporate into a
particular weave, the punched cards would be used to define a sequence of basic arithmetic
operations for the Analytical Engine to perform that, together, achieved a desired mathematical
result. In other words, unlike previous calculating machines, the Analytical Engine would be
programmable: just as a single automated loom could perform different weaves simply by
switching sets of punched cards, so Babbage’s Analytical Engine would be able to switch
between different mathematical computations simply by changing the set of punched cards. To a
remarkable degree, the Analytical Engine anticipated the fundamental "architecture" of the
modern electronic computer in that it was organized into the four primary subsystems of
processing, storage, input, and output.
Ada Lovelace, daughter of Lord Byron, was one of the few people other than Babbage
who understood the Analytical Engine's enormous potential. She described the similarity of
Jacquard's and Babbage's inventions: "The Analytical Engine weaves algebraic patterns just as
the Jacquard loom weaves flowers and leaves"; in both cases, simply by performing a carefully
devised sequence of basic operations. Lovelace designed and wrote out demonstrations of how
complex mathematical computations could be constructed entirely from sequences of the basic
set of arithmetic operations of which the Analytical Engine would be capable. Ada Lovelace is
widely regarded as the first computer programmer.
Lovelace also noted that a good computer program is one that is general, in the sense that
the designed sequence of operations should be able to "be performed on an infinite variety of
particular numerical values" rather than being designed to operate upon only a specific set of
operands. For example, rather than designing a program to perform
(2 x 10) - 5
a program that is "general" would be one designed to take in any three numbers, multiply the first
two, and then subtract the third number from the result. This illustrates what Lovelace described
as operations that are "independent" from "the objects operated upon." In modern computing,
this is often described in terms of the separation that is maintained between the "data" and the
operations upon that data.
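As a rough illustration of what a "general" program looks like in practice, the following sketch (written in Processing, the language used later in this module; the function name computeGeneral and the sample operands are our own, not Lovelace's) accepts any three numbers, multiplies the first two, and subtracts the third:

    // A "general" program in Lovelace's sense: it works for any three operands,
    // not just for the specific computation (2 x 10) - 5.
    float computeGeneral(float a, float b, float c) {
      return (a * b) - c;   // multiply the first two numbers, then subtract the third
    }

    void setup() {
      println(computeGeneral(2, 10, 5));   // prints 15.0, the specific case above
      println(computeGeneral(7, 3, 4));    // prints 17.0, any other operands work too
    }

Because the operations are kept separate from the particular numbers, the same program can be reused for an infinite variety of operands.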
Lovelace also described what she called "cycles of operations," where a certain desired result
could be achieved by performing a certain subset of operations repeatedly. In modern computer
programming, this is called a "loop." One simple illustration of this is the way that any
multiplication operation
5 x 4
can also be achieved by repeatedly performing a single addition operation: 5 + 5 + 5 + 5
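A short Processing sketch makes the correspondence between a "cycle of operations" and a modern loop explicit (the variable names are ours, chosen only for illustration):

    // Compute 5 x 4 using nothing but repeated addition, i.e., a cycle of operations.
    int product = 0;
    for (int i = 0; i < 4; i++) {
      product = product + 5;   // add 5 once per pass through the loop
    }
    println(product);          // prints 20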
This remarkable capability of the computer to automatically perform cycles of operation,
with the results building upon each other, is part of the reason why Lovelace boldly claimed that
such a machine would in fact be able to perform computations that had not ever "been actually
worked out" previously by any human. This also underscores an important aspect of
programming that is sometimes overlooked: namely, that the process of programming seldom
consists of the mere encoding in a programming language of a fully envisioned idea of the
software that was already completely worked out, conceptually. Rather, the process of
programming is one that entails exploration, experimentation, discovery, creativity, and invention.
As we now know, computers can do so much more than the kind of arithmetic
computations that are involved in doing one's taxes. This is because computers have indeed
motivated software programmers to discover just how many actual and imaginary facts and
entities can, to at least some degree, be modeled in the form of symbols that can be manipulated
in ways that mimic actions, processes, phenomena, and relationships.
Object-oriented programming (OOP) emerged largely from attempts in the latter half of
the 1970s to develop a new generation of "personal computers" and "graphical user interfaces",
and the rapid rise in popularity of these technologies beginning in the latter half of the 1980s was
accompanied by a similar rise to prominence of the object-oriented notions of programming that
enabled them.
The Dynabook's GUI also enabled the user to operate in multiple onscreen "windows" that
could provide different views simultaneously. The data created and manipulated by the user was
represented on the screen in the form of interactive virtual "objects": passages of text, drawings,
photographs, sounds and music, descriptive icons, and much more. Such a GUI was designed
to create an interactive on-screen representation of an environment for action that would be
immediately recognized and easily manipulated by the user. However, the Smalltalk software
system of the Dynabook was also a graphical object-oriented programming environment. The
driving vision of Kay's team was one in which Dynabook users would begin by interacting with
intuitive software created by others but, when they were ready, they would also be able to
examine and even change the defined characteristics and behaviors of the virtual "objects" that
comprised these software programs. In fact, the goal was to design a version of the Smalltalk
programming language that was so intuitive that even children using the Dynabook might
eventually decide to author their own software programs. Early tests with Palo Alto school
children had proven to be very promising in this regard.
Like other early microcomputers, initially the Apple II was designed and marketed on the
assumption that users would usually write their own software (probably using one of the
introductory procedural programming languages of that era, such as the widely-used "BASIC").
However, another key event had taken place in 1979, one that subsequently propelled the Apple
II to heights of popularity that went far beyond computer hobbyists and programmers.
Spreadsheet software provided novice computer users with the ability to undertake the
kind of computational operations on numerical and textual data that were so strongly associated
with computers without having to learn what they would have viewed as a “real” programming
language. In fact, especially in light of the overall history of computing, it does not seem to be too
much of a stretch to suggest that spreadsheet software provided a new kind of computer
programming environment. Novice users certainly found the designing of computations within
What seems to have been completely lost in the Apple’s and Microsoft's adaptations of
PARC’s Dynabook vision was the idea of enabling novice computer users not only to use
software written by others but also to subsequently proceed comfortably along a path of learning
that would lead to an ability to begin to modify those software programs and, eventually, to create
simple software programs of their own. This was central to Alan Kay's conception of the personal
computer as a medium, one that provided power to construct unique kinds of models of ideas
that could be experienced as a kind of performance. In fact, Kay credited the notions of both the
theatrical and musical performance as having been highly influential upon the design of the
Dynabook. Kay has said that he envisioned the personal computer as something like "an
instrument whose music is ideas." In contrast, Kay has said, the version of personal computing
inaugurated by Apple and Microsoft in the 1980s has made for two decades of a user experience
that is more like "air guitar." Instead of using the personal computer as a medium in which to
design and stage performances of our own ideas, we instead perform within models of user
experience that have been designed by others, ones that are often troublingly generic,
constraining, and difficult to identify with.
Indeed, along with the installation of the Macintosh and Windows GUIs as the
dominant interface paradigm came a new conception of the computer "user" as the "end user"
who, by definition, was a non-programmer. This conception of “personal computing” as being the
exact opposite of computer programming has persisted for more than two decades.
However, part of the reason why the Macintosh and Windows operating systems were
ultimately such severe reductions in the PARC vision of personal computing, in spite of the many
similarities of the GUIs, is that it simply couldn’t be fully achieved affordably on the hardware
technologies available at the time. Moreover, in spite of early successes, PARC researchers
themselves came to realize that their vision of a version of Smalltalk as a programming language
that would be intuitive had proven to be more difficult to achieve than expected.
Perhaps it must be said that something of this sort has happened in regard to the kind of
“binary” relationship that has come to be constructed in public perceptions of computer
programming vs. personal computing. Indeed, at times, this diametric relationship even reaches
the point of bigotry, where end users are derided as ignorant by programmers, and programmers
are labeled as “geeks” by end users.
However, this new millennium has also come to see a renewed, gradually increasing
interest in the idea of programming on the part of persons whose passionate use of personal
computer software has led to a desire to break out of the end user “box” in order to customize
their user experience or to invent new forms of user experience entirely. This is especially true in
the case of certain areas of computing that have come to be known by such varying names as
"digital art," "digital imaging," "digital media," and "digital design," where the personal computer is
explicitly regarded as an artistic medium. Given that computer programming tends to be so
strongly associated with numerical data of math, science, and business, it might initially seem
surprising that persons who approach computing from a more artistic direction would be inclined
toward learning to program. On the other hand, it is also the case that persons who undertake
more artistic and expressive design processes are less likely to be accepting of the more generic
and limiting workflows inherent in most commercially available application software. In this
sense, perhaps it should not be surprising that so many digital photographers and designers of
digital graphics, for example, should undertake to customize the applications they use by
designing their own "filters" and "extensions" for such software, in spite of the degree of
difficulty that is often involved in developing such customizations. For some, this artistic drive
even leads them to teach themselves a standard programming language, which is certainly the
most difficult method of acquiring such knowledge. In fact, the "Processing" programming
language that is used in this book was originally developed in 2001 at the Massachusetts Institute
of Technology for the purpose of making it easier for artists to create dynamic graphical art on a
computer by combining images, animation, and interactions.
VIII. ACTIVITY
Acronyms: Give the corresponding meaning and definition of the acronyms below.
1. GUI
IX. OUTPUT(RESULT)
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
X. EVALUATION
Instruction: Multiple Choice. Choose the letter of the word that best matches the definition stated
on each number. Write your answer on the right column blank. 1 point each.
1. It is computer _________ that provides such meaningful, useful direction.     1. __________
   a) Software      b) Hardware      c) System      d) CPU
REFERENCES
https://cs.calvin.edu/activities/books/processing/text/01computing.pdf
As was noted earlier, a modern electronic computer remains, at its most fundamental
hardware level, a binary, digital arithmetic machine. The “processor” that is at its core operates
on “data” that are understood to be binary numbers. Each digit of such a data item consists of
either a 0 or 1 and is represented and sent to the processor in the form of an electrical signal with
a voltage that is evaluated to be either “high” or “low” which in turn serves to “flip” certain of the
processor’s on/off transistor switches in such a way that mimics the “inputting” of this binary
number into the processor.
High-level programming languages also allow data that consists of letters and punctuation
marks and include commands that are very similar to words used in human languages. For
example, in a typical high-level programming language, an instruction to print the greeting “Hello!”
on the screen could be something like:
print("Hello!")
This translation of software written in a high-level language into binary instructions that can
be sent to the processor is called “compiling” a program. Instructions written in the high-level
programming language, often called the “source code” of that software program, are fed into a
special translation software program called a “compiler” which is designed to convert source code
written in a particular high-level programming language into the needed sequence of binary instructions.
However, there have always been certain complications inherent in the compiling of
software programs. First of all, the set of binary instructions that can be sent to a processor can
vary from one kind of processor to another. As a result, source code that has been compiled for
one kind of processor chip might not be executable on a different kind of processor chip. This is
a big part of the reason why personal computer application software such as a word processor
will typically be made available in separate “Macintosh” and “PC” versions: because the binary
instructions in the executable code are always sent to a processor at the direction of a particular
operating system, and the Macintosh and Windows operating systems themselves are software
programs that have been designed and compiled for totally different categories of processor
chips. This is in contrast to the very similar "Unix" and "Linux" family of operating systems which
can be compiled to operate on an enormous variety of processor chips, including the kinds of
processor chips found in both Macintosh and Windows computers.
PLATFORM INDEPENDENCE
The explosion in the popularity of the World Wide Web beginning in the latter half of the
1990s also served to underscore the problem of incompatible personal computer “platforms.”
From the very outset, the Internet was designed to be a “platform-independent” infrastructure.
Thus, for example, because the “JPEG” digital image format was designed specifically for use on
the World Wide Web, it was designed to be platform-independent. As a result, any given JPEG
image file can be accessed, viewed, manipulated, and exchanged over the Internet by users of
Macintosh, Windows/PC, and Unix/Linux personal computers alike.
The World Wide Web’s platform-independent, multimedia data formats served also to
provide an impetus for exploring the feasibility of developing platform-independent software
programs. Indeed, this is much of the reason why the platform-independent “Java” programming
language has risen to such heights of popularity over the course of the past decade. The source
code of a software program written in Java is not compiled directly into the binary code for a
specific processor chip. Rather, the Java source code is compiled for a “virtual machine” – that
is, for a kind of “generic” processor that doesn’t actually exist in any hardware form. Thus, the
code that results from the Java compiler is not really executable binary code yet; rather, it
represents an intermediate step called “byte code.”
PROCESSING
The complexity of the procedure used to enter a program, compile it, and execute it— commonly
known as the programming environment — is often a hindrance to learning how to program. In
a command line environment, one may have to learn a collection of commands for the operating
system (e.g., Unix) that is being used and, in addition, editor commands for entering and
modifying the program. However, a variety of integrated development environments (IDEs)
such as Visual Studio developed by Microsoft and the popular open-source IDE Eclipse are
available that make this considerably easier. Typically, one first creates a new project, specifying
the programming language being used, and then perhaps adds some libraries to the project,
maybe rearranges some windows and creates a package and a text file, enters the source code
for the program in this file and saves it, and finally builds the project. The resulting object program
can be executed.
However, the Processing environment is one of the simplest to use. When it is started, a simple
"sketch window" appears that has six buttons at the top, a program editor window below this, and
a text output window at the bottom, as pictured on the right in Figure 1-1.
The program editor currently contains no program code — i.e., it contains an empty
program. But if we click the leftmost (Run) button at the top of the sketch window, another
window (pictured on the left in Figure 1-1) appears. It is called the visual output window because
it contains graphical output produced by a program. In this example, no output is displayed
because the program is empty.
A program without any useful code is not very interesting, so we add this line:
println("Hello World!");
as shown in Figure 1-2. Once again, we click the Run button. The output:
Hello World!
appears in the text output window. (No output appears in the visual output window because we
are outputting text, not graphics.)
To obtain a simple graphic representation in addition to the text output, additional lines can be
added, as shown in Figure 1-4, to fill the background with one color (black, as specified by the
color value 0, which indicates no light) and to draw the circle in another (blue, as specified by the
red-green-blue color triple indicating no red, no green, and full intensity, 255, for blue).
(The coordinate system used in the visual output window has the origin (0, 0) at the upper left
corner, the positive x-axis directed to the right, and the positive y-axis directed downward.)
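For readers without access to the figures, the following Processing sketch is consistent with the description above; the window size and the circle's position and diameter are our own assumptions, since the original figure is not reproduced here:

    size(300, 300);              // open a 300 x 300 pixel visual output window (assumed size)
    background(0);               // fill the background with black (color value 0, no light)
    fill(0, 0, 255);             // no red, no green, full-intensity blue
    ellipse(150, 150, 100, 100); // draw a circle at the center of the window (assumed position)
    println("Hello World!");     // text output still appears in the text output window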
If we prefer to have a more realistic picture of the earth, we need only download an image file of
the Earth into the folder containing our program and modify the program as shown in Figure 1-5.
The preceding examples are intended to provide a first exposure to the Processing environment
and to give you some indication of how easy it is to do some exciting programming in it. We will
explain the code in these examples and much more in the chapters that follow.
DEFINITION: Define the following terms below. Write your answer on the space provided.
1. Processor -
2. Integrated Development Environments (IDEs) -
3. Programming Languages -
4. World Wide Web (www) -
5. Java -
XII. OUTPUT(RESULT)
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
XIII. EVALUATION
Instruction: Multiple Choice. Choose the letter of the word that best matches the definition stated
on each number. Write your answer on the right column blank. 1 point each.
1. In a typical high-level programming language, an instruction to print the
   greeting "Hello!" on the screen could be something like:                  1. __________
   a) print("Hello!")      b) Hello      c) print Hello      d) None of the above

5. The "source code" of a software program is fed into a special translation
   software program called a:                                                5. __________
   a) Compiler      b) Software      c) IDE      d) None of the above
Rubrics:
II. SUBJECT MATTER: LESSON 5. Key components of a computer system and computer
software.
Time frame: 9 hours (3 parts)
IV. ENGAGEMENT:
2. Digital Computer
This computer accepts digital input and provides digital output after
processing the information; its operations are in a binary system of
0 and 1. By manipulating the binary digits and numbers it can
perform any task, such as analyzing data, mathematical calculations, etc.
Examples of digital computers are the Apple Macintosh and the IBM PC.
3. Hybrid Computer
This computer is a combination of both analog and digital computers in
terms of speed and accuracy. Hybrid computers can measure both physical and
digital quantities. Examples of hybrid computers are the machines that measure
heartbeat in hospitals and the devices installed in fuel pumps.
TYPES OF COMPUTER
There are many types of computers, some of which are given below:
1. Super Computer
The fastest and most powerful type of computer. Supercomputers are
very expensive and are employed for specialized applications that
require immense amounts of mathematical calculations. For
example, weather forecasting requires a supercomputer. Other uses
of supercomputers include animated graphics, fluid dynamic
calculations, nuclear energy research, and petroleum exploration.
2. Mainframe Computer
A very large and expensive computer that is capable of supporting hundreds or
even thousands of users simultaneously. In the hierarchy that starts with a
simple microprocessor (in watches, for example) at the bottom and moves to
supercomputers at the top, mainframes are just below supercomputers. In
some ways, mainframes are more powerful than supercomputers because they
support more simultaneous programs. But supercomputers can execute a
single program faster than a mainframe.
3. Mini Computer
A midsized computer. Minicomputers lie between workstations and
mainframes. In the past decade, the difference between large minicomputers and
small mainframes has blurred, however, as has the distinction between small
minicomputers and workstations. But in general, a minicomputer is a
multiprocessing system capable of supporting from 4 to about 200 users
simultaneously.
ENUMERATION: List down what is asked in the given statement. Write your answer on the
space provided.
B. Digital Computer
6. _________________________
7. _________________________
8. _________________________
9. _________________________
10. _________________________
C. Hybrid Computer
11. _________________________
12. _________________________
13. _________________________
14. _________________________
15. _________________________
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
Instruction: Multiple Choice. Choose the letter of the word that best matches the definition stated
on each number. Write your answer on the right column blank. 1 point each.
1. The first generation of computers used vacuum tubes as a
major piece of technology.
1. _________
a) First Generation c) Third Generation
b) Second Generation d) Fourth Generation
2. This computer accepts digital input and provides digital output
after processing information and the operation are in a binary
system of 0 and 1.
2. _________
a) Digital Computer c) Analog Computers
b) Hybrid Computer d) Super Computer
3. It accepts analog input and provides analog output information.
3. _________
a) Analog Computers c) Digital Computer
b) Hybrid Computer d) Super Computer
5. _________
a) Super Computer c) Digital Computer
b) Analog Computers d) Hybrid Computer
6. A very large and expensive computer that is capable of
supporting hundreds or even thousands of users
simultaneously.
6. _________
a) Mainframe Computer c) Digital Computer
b) Analog Computers d) Hybrid Computer
7. A midsized computer.
7. _________
a) Mini Computer c) Digital Computer
b) Analog Computers d) Hybrid Computer
8. A personal or micro-mini computer small enough to fit on a desk.
8. _________
a) Desktop Computer c) Digital Computer
b) Analog Computers d) Hybrid Computer
9. A portable computer complete with an integrated screen and keyboard.
9. _________
a) Laptop Computer c) Digital Computer
b) Analog Computers d) Hybrid Computer
10. Palmtops have no keyboard but the screen serves both as an
input and output device.
10. _________
a) Hand-sized Computer c) Digital Computer
b) Hybrid Computer d) Super Computer
2. Storage Operation
The given information is stored in the computer using different storage devices, i.e., the central
processing unit's main memory and auxiliary memory. Auxiliary memory, also known as secondary or
external storage, consists of devices such as floppy disks, hard disks, compact discs, and flash
drives. These different storage devices have both advantages and disadvantages. Auxiliary
storage stores information long term and permanently.
3. Processing Operation
It is considered the basic computing operation. It executes the instructions and controls the storage
of data and the input or output devices attached to the computer.
● Main memory
● Secondary memory
● Slots /Ports
● Buses
These parts are typically accommodated within the laptop or desktop unit itself, except for the
desktop keyboard and mouse. What is likely the most important piece of hardware is the
microprocessor chip known as the central processing unit (CPU).
2. Storage Devices
The motherboard, known as the logic board on Apple computers, is a printed circuit board and the foundation of
a computer that is the biggest board in a computer chassis. It allocates power and allows
communication to and between the CPU, RAM, and all other computer hardware
components. A motherboard provides connectivity between the hardware components of
a computer, like the processor (CPU), memory (RAM), hard drive, and video card. There
are multiple types of motherboards, designed to fit different types and sizes of computers.
1. System Software - This is also commonly known as an operating system (OS). The
system manages other software and devices inside the computer. In a typical setup, the
operating system is like the motherboard software. It is the first thing that is installed,
followed by applications and utility software. Three popular operating systems for
traditional computers include Windows, Mac OS X, and Linux. Popular mobile operating
systems include Android OS, iPhone OS, Windows Phone OS, and Firefox OS.
2. Application Software - This is designed for end users. This software is meant to perform
a specialized assignment and output useful information. An example would be a word
processing application that one uses to compose a letter or a brochure, such as Microsoft
Word.
IDENTIFICATION: Identify the following parts of a motherboard below. Write your answer on the
space provided.
IX. OUTPUT(RESULT)
Submit your output in hard copy or soft copy @ our group chat messenger /
salazarjoshuaanuada@gmail.com.
IDENTIFICATION: Identify the following computer parts below. Write your answer on the space
provided.
1. ___________________________
2. ___________________________
3. ___________________________
4. ___________________________
5. ___________________________
6. ___________________________
7. ___________________________
8. ___________________________
9. ___________________________
10. ___________________________
Rubrics:
Score   Criteria            Grade Equivalent
10      OUTSTANDING         100%
9       OUTSTANDING         96%
8       EXCELLENT           92%
7       EXCELLENT           88%
6       VERY GOOD           84%
5       VERY GOOD           80%
4       AVERAGE             76%
3       AVERAGE             72%
2       POOR                68%
1       POOR                64%
0       NEEDS COUNSELLING   60%
REFERENCES
https://www.computerhope.com/jargon/m/mouse.htm
https://www.researchgate.net/publication/324528000
Checked by:
Approved by:
Noted by: