INTRODUCTION TO PROGRAMMING CONCEPTS
WEEK 1-2

What is a Computer?
Components of a Computer
Hardware Concepts
Digital Data Representation
Bits and Bytes
Digital Electronics
Software Concepts
Software Basics
The Evolution of Programming Languages
What is a Computer?

an electronic device capable of performing complex computations in a short time

a fast electronic calculating machine that accepts input information, processes it according to a list of internally stored instructions called a program, and produces the resultant output information

a program is a set of instructions for a computer, telling it what to do or how to behave

programming is the craft of implementing one or more interrelated abstract algorithms using a particular programming language to produce a concrete computer program
Components of a Computer

Hardware
the physical equipment of a computer system, including the monitor, keyboard, central processing unit, and storage devices

Software
refers to one or more computer programs and data held in the storage of a computer for some purpose
Hardware Concepts

Data representation: conversion of images, letters, and sounds into electrical signals

Digital electronics: manipulation of “on” and “off” signals to perform complex tasks

Signals: variations in physical phenomena
Analog: any continuous signal
Digital: representation of a sequence of distinct values
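The analog-to-digital distinction can be sketched in code. This is only an illustration, not from the slides: the sine wave stands in for any continuous signal, and the 4-bit range is an arbitrary choice.

```python
import math

# Sample a continuous (analog) sine wave at discrete points and quantize
# each sample to an integer level, turning it into a digital sequence.
SAMPLES = 8   # how many discrete points to take
LEVELS = 16   # 4-bit quantization: 16 distinct values (0..15)

digital = []
for i in range(SAMPLES):
    t = i / SAMPLES                      # time in [0, 1)
    analog = math.sin(2 * math.pi * t)   # continuous value in [-1, 1]
    level = round((analog + 1) / 2 * (LEVELS - 1))  # map to 0..15
    digital.append(level)

print(digital)  # a short sequence of distinct values, e.g. [8, 13, 15, ...]
```

The continuous signal can take any value in between; the digital version keeps only a finite set of distinct levels.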
Digital Data Representation

The “0” and “1” are also known as bits, or binary digits

A sequence of 0s and 1s could be written as 10011; such a sequence may have significant value, for instance it may represent a certain character or number
How can a computer represent numbers using bits?

Numeric data like your age, salary, or electricity bill is easily understood by humans. How, then, can a computer show a number using only binary digits?

The computer uses the binary number system, which uses only two digits: 0 and 1

A series of 0s and 1s represents a particular number in much the same way as the decimal number system
This table shows how the binary system works:

Decimal (Base 10)   Binary (Base 2)
0                   0000
1                   0001
2                   0010
3                   0011
4                   0100
5                   0101
6                   0110
7                   0111
8                   1000
9                   1001
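The table's conversions can be checked with Python's built-in number formatting; this is a sketch using only the standard `format` and `int` built-ins.

```python
# Decimal -> binary: format() with the "b" spec; "04b" pads to 4 digits.
for decimal in range(10):
    print(decimal, "=", format(decimal, "04b"))

# Binary -> decimal: int() with an explicit base of 2.
print(int("0101", 2))  # 5
```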
How does 5 become 0101 when converted to binary?

Notice that there are 4 digits, namely 0, 1, 0, and 1

The positions of the 0s and 1s hold significant values: from left to right, the four positions stand for 8, 4, 2, and 1

If a position contains 0, its value is turned “off”

If a position contains 1, its value is turned “on” and is added to the value of the number

So 0101 = 4 + 1 = 5
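The on/off place-value rule can be sketched as code; the 8-4-2-1 weights are the standard binary place values for a 4-digit number.

```python
# Each position of "0101" carries a weight: 8, 4, 2, 1 (left to right).
# A 1 turns that weight "on" (it is added); a 0 leaves it "off".
bits = "0101"
weights = [8, 4, 2, 1]

value = 0
for bit, weight in zip(bits, weights):
    if bit == "1":       # position is "on"
        value += weight  # add its place value

print(value)  # 4 + 1 = 5
```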
How, then, can a computer represent words and letters using bits?

Bits can also be used to represent character data, much as Morse code does

In this case, computers make use of 0 and 1 in place of dashes and dots

Example: ME has a corresponding binary value
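As a sketch of the idea, each letter of “ME” maps to a 7-bit ASCII pattern, retrievable with Python's built-in `ord`:

```python
# ASCII assigns each character a number; format() shows its 7-bit pattern.
for ch in "ME":
    code = ord(ch)  # character -> ASCII code
    print(ch, code, format(code, "07b"))
# M -> 77 -> 1001101
# E -> 69 -> 1000101
```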
ASCII (American Standard Code for Information Interchange)
requires only seven bits for each character

EBCDIC (Extended Binary Coded Decimal Interchange Code)
an alternative 8-bit code used by older IBM mainframe computers

Unicode
uses 8, 16, or 32 bits per character, providing codes for more than 65,000 characters (enough to represent the alphabets of multiple languages); it has become the dominant character encoding

Extended ASCII Code
makes use of a series of 0s and 1s to represent 256 characters
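The size differences among these encodings can be glimpsed with Python's `str.encode`; this is a sketch, and the character Ñ is just an illustrative choice of a letter outside 7-bit ASCII.

```python
# A plain ASCII character fits in one byte; characters beyond ASCII
# need more room, and each UTF form uses a different unit size.
print(len("A".encode("ascii")))     # 1 byte (only 7 bits are significant)
print(len("Ñ".encode("utf-8")))     # 2 bytes in UTF-8 (variable 8-bit units)
print(len("Ñ".encode("utf-16-le"))) # 2 bytes (one 16-bit unit)
print(len("Ñ".encode("utf-32-le"))) # 4 bytes (one 32-bit unit)
```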
Bits and Bytes

BIT
an abbreviation of binary digit
commonly abbreviated as a lowercase “b”

BYTE
a collection of bits (8 to be exact)
usually abbreviated as an uppercase “B”
PREFIX   ABBREVIATION   VALUE
Kilo     K              2^10
Mega     M              2^20
Giga     G              2^30
Tera     T              2^40
Digital Electronics

How are bits stored and transferred from one point to another?

Bits take the form of electrical pulses traveling over the circuits

All circuits, chips, and mechanical components forming a computer are designed to work with bits
[Computer diagram: Input Devices feed the Processor (CPU), which exchanges data with Memory and sends results to Output Devices]
Input devices are machines that generate input for the computer, such as the keyboard and mouse

Processor or CPU (Central Processing Unit) is the central electronic chip that controls the processes in the computer; it determines the processing power of the computer

Output devices are machines that display information from the computer, such as the monitor, speakers, and printer

Memory is the part of the computer that stores applications, documents, and operating system information
Software Concepts

What is software?

computer instructions or data

anything that can be stored electronically

computer programs and modules (support and data) working together, providing computers with instructions and data for certain tasks (e.g., word processing, internet browsing)
Software Basics

Computer program (or “program”)
an organized list of instructions that, when executed, causes the computer to behave in a predetermined manner

Support module
an auxiliary set of instructions used in conjunction with the main software program

Data module
contains data (not supplied by the user) necessary for the execution of a certain task
Data vs. Software

The term “software” used to be associated with all non-hardware components of a computer

Modern definitions make it clear that documents, spreadsheets, and even materials downloaded from the net are now classified as data, not software
Technically, the compiler creates object code, which is
the machine language representation of the source
code, and the linker creates the executable code
The linker combines program object code, object code
from library routines and any other required system
code into one addressable machine language file
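Python's own compiler offers a loose analogy to the compile step described above. This sketch only illustrates the source-to-object translation; Python has no separate linker in the sense used here.

```python
# Source text is translated into a code object (akin to object code)
# before the interpreter can execute it.
source = "result = 2 + 3"
code_obj = compile(source, "<example>", "exec")  # source -> code object

namespace = {}
exec(code_obj, namespace)   # execute the compiled form
print(namespace["result"])  # 5
```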
Software Basics

Application Software vs. System Software

Application software consists of computer programs that are used to accomplish specific or specialized tasks for computer users, such as creating and editing documents (word processing), making graphic presentations, or listening to MP3 music

System software helps the computer carry out its basic operating functions
[Diagram: printing a document]
1. The computer user instructs the application software (e.g., Microsoft Word) to print a document
2. The application software makes a request to the system software to print the document
3. The system software (operating system, device drivers) controls the hardware (computer, printer) as the document is printed
Evolution of Programming Languages

First Generation: Machine Languages – use a binary code (strings of 1s and 0s) that can be understood by the computer and executed directly without any need for translation
Late 1940s to early 1950s
Uses binary code (zeroes and ones)
Very difficult to learn and use
Machine-dependent
Second Generation: Assembly Languages – use mnemonics (very short words) for commands
Early to mid-1950s
Slight improvement over machine language, but still difficult to learn and use
Needs to be converted to machine language by an assembler
Machine-dependent
Together with 1GL, considered low-level languages
Third Generation: High-Level Languages – use data structures and control structures that are abstractions of programming concepts
Began during the mid-to-late 1950s
Closer to the level of human languages
High-level data structures and control structures
Machine-independent, or portable
Needs to be compiled into object code
Examples: FORTRAN, Algol, BASIC, Pascal, C
Fourth Generation: Declarative Languages – also called “non-procedural specification languages”; a programmer who writes 4GL programs concentrates on what needs to be done (result/output) rather than how to do it (steps/process)
Brought about by the “programming crisis” of the early 1970s
Declarative or non-procedural specification languages
English-like
“What”-oriented rather than “how”-oriented
Examples: Standard ML, Lisp, Haskell, SQL, Oracle Designer & Developer, VB
Fifth Generation: AI Languages – problem solving based on constraints or rules that have been declared in the program
From the 1990s onwards
Outgrowth of artificial intelligence (AI) research
Constraint/rule-based programming, rather than algorithm-based problem solving
Focus is on making the program solve the problem for you, rather than specifying the actual problem-solving steps
Examples: Prolog, OPS5, and Mercury