Java Programming
Object-Oriented Programming:
Object-oriented programming is at the core of Java. In fact, all Java programs are object-oriented; this is not an option but a requirement. Therefore, this chapter begins with a discussion of the theoretical aspects of OOP.
Two Paradigms of Programming:
As you know, all computer programs consist of two elements: code and data. Furthermore, a program can be conceptually organized around its code or around its data. That is, some programs are written around "what is happening" and others are written around "what is being affected." These are the two paradigms that govern how a program is constructed.
The first way is called the process-oriented model. This approach characterizes a program
as a series of linear steps (that is, code). The process-oriented model can be thought of as
code acting on data. Procedural languages such as C employ this model to considerable
success. Problems with this approach appear as programs grow larger and more complex.
To manage increasing complexity, the second approach, called object-oriented
programming, was conceived.
Object-oriented programming organizes a program around its data (that is, objects) and a
set of well-defined interfaces to that data. An object-oriented program can be characterized
as data controlling access to code. As you will see, by switching the controlling entity to data,
you can achieve several organizational benefits.
Procedure-oriented Programming:
In this approach, the problem is always considered as a sequence of tasks to be done. A number of functions are written to accomplish these tasks. The primary focus is on functions, with little attention paid to the data.
There are many high level languages like COBOL, FORTRAN, PASCAL, C used for
conventional programming commonly known as POP.
POP basically consists of writing a list of instructions for the computer to follow, and
organizing these instructions into groups known as functions.
A typical POP structure is shown below. Normally a flowchart is used to organize these actions and represent the flow of control as a logically sequential flow from one function to another. In a multi-function program, many important data items are placed as global so that they may be accessed by all the functions. Each function may have its own local data. Global data are more vulnerable to an inadvertent change by a function. In a large program it is very difficult to identify what data is used by which function. In case we need to revise an external data structure, we must also revise all the functions that access that data. This provides an opportunity for bugs to creep in.
Drawback: It does not model real-world problems very well, because functions are action-oriented and do not really correspond to the elements of the problem.
Characteristics of POP:
Emphasis is on doing actions.
Large programs are divided into smaller programs known as functions.
Most of the functions share global data.
Data move openly around the program from function to function.
Functions transform data from one form to another.
Employs top-down approach in program design.
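The characteristics above can be sketched in a few lines of Python (an illustrative example, not from the text; the names are invented): several functions openly share one piece of global data, so any function can change it.

```python
# Procedural sketch: functions share one global data item (all names invented).
balance = 1000  # global data, accessible to every function

def deposit(amount):
    global balance
    balance = balance + amount

def withdraw(amount):
    global balance
    balance = balance - amount  # no check: any function may change the shared data

def report():
    return balance

deposit(500)
withdraw(200)
print(report())  # 1300
```

Because the data moves openly between functions, nothing prevents one function from corrupting it, which is exactly the vulnerability described above.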
OOP:
OOP allows us to decompose a problem into a number of entities called objects and then build data and methods around these entities.
DEF: OOP is an approach that provides a way of modularizing programs by creating partitioned memory areas for both data and methods that can be used as templates for creating copies of such modules on demand.
That is, an object is considered to be a partitioned area of computer memory that stores data and a set of operations that can access that data. Since the memory partitions are independent, the objects can be used in a variety of different programs without modification.
Characteristics of OOP:
Emphasis is on data.
Programs are divided into what are known as objects.
Data structures are designed such that they characterize the objects.
Methods that operate on the data of an object are tied together.
Data is hidden.
Objects can communicate with each other through methods.
Reusability.
Follows bottom-up approach in program design.
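For contrast, here is a minimal sketch of the same idea organized the OOP way (illustrative names; note that the underscore-prefix hiding shown here is a Python convention, not strict enforcement): the data is tied to its methods, and outside code reaches it only through those methods.

```python
# OOP sketch: the balance is tied to its methods inside an object.
class Account:
    def __init__(self, opening_balance):
        self._balance = opening_balance  # data hidden by convention ("_" prefix)

    def deposit(self, amount):
        self._balance += amount

    def withdraw(self, amount):
        if amount > self._balance:       # the object controls access to its data
            raise ValueError("insufficient funds")
        self._balance -= amount

    def balance(self):
        return self._balance

acct = Account(1000)
acct.deposit(500)
acct.withdraw(200)
print(acct.balance())  # 1300
```

Here the object, not the caller, decides whether a withdrawal is allowed: data controls access to code.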
Evolution of Computing and Programming:
Computer use is increasing in almost every field of endeavor. Computing costs have been decreasing dramatically due to rapid
developments in both hardware and software technologies. Computers that might have filled
large rooms and cost millions of dollars decades ago can now be inscribed on silicon chips
smaller than a fingernail, costing perhaps a few dollars each. Fortunately, silicon is one of
the most abundant materials on earth; it is an ingredient in common sand. Silicon chip
technology has made computing so economical that about a billion general-purpose
computers are in use worldwide, helping people in business, industry and government, and
in their personal lives. The number could easily double in the next few years. Over the years,
many programmers learned the programming methodology called structured programming.
You will learn structured programming and an exciting newer methodology, object-oriented
programming. Why do we teach both? Object orientation is the key programming
methodology used by programmers today. You will create and work with many software
objects in this text. But you will discover that their internal structure is often built using
structured-programming techniques. Also, the logic of manipulating objects is occasionally
expressed with structured programming.
Python Programming
INTRODUCTION: DATA, EXPRESSIONS, STATEMENTS
Introduction to Python and installation; data types: int, float, Boolean, string, and list; variables, expressions, statements, precedence of operators, comments; modules, functions -- function and its use, flow of execution, parameters and arguments.
Introduction to Python and installation:
Python is a widely used general-purpose, high-level programming language. It was initially designed by Guido van Rossum in 1991 and developed by the Python Software Foundation. It was mainly developed with an emphasis on code readability, and its syntax allows programmers to express concepts in fewer lines of code.
Python is a programming language that lets you work quickly and integrate systems more efficiently.
There are two major Python versions- Python 2 and Python 3.
• On 16 October 2000, Python 2.0 was released with many new features.
• On 3 December 2008, Python 3.0 was released with more testing and new features.
Beginning with Python programming:
1) Finding an Interpreter:
Before we start Python programming, we need to have an interpreter to interpret and run our programs. There are certain online interpreters like https://ide.geeksforgeeks.org/, http://ideone.com/ or http://codepad.org/ that can be used to start Python without installing an interpreter.
Windows: There are many interpreters available freely to run Python scripts, like IDLE (Integrated Development Environment), which is installed when you install the Python software from http://python.org/downloads/
2) Writing first program:
# Script Begins
Statement1
Statement2
Statement3
# Script Ends
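As a concrete stand-in for the placeholder statements above, a first script might look like this (the statements themselves are invented examples):

```python
# Script Begins
print("Hello, World!")       # Statement1
name = "Python"              # Statement2
print("Welcome to", name)    # Statement3
# Script Ends
```

Saving these lines in a file such as first.py and running it with the interpreter executes the statements top to bottom.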
Differences between scripting language and programming language:
Why to use Python:
The following are the primary factors to use python in day-to-day life:
1. Python is object-oriented
Its structure supports such concepts as polymorphism, operator overloading, and multiple inheritance.
2. Indentation
Indentation is one of the greatest features in Python.
3. It’s free (open source)
Downloading python and installing python is free and easy
4. It’s Powerful
Dynamic typing
Built-in types and tools
Library utilities
Third party utilities (e.g. Numeric, NumPy, sciPy)
Automatic memory management
5. It’s Portable
Python runs on virtually every major platform used today.
As long as you have a compatible Python interpreter installed, Python programs will run in exactly the same manner, irrespective of platform.
6. It’s easy to use and learn
No separate compile step is needed.
Python programs are compiled automatically to an intermediate form called byte code, which the interpreter then reads.
This gives Python the development speed of an interpreter without the performance loss inherent in purely interpreted languages.
Structure and syntax are pretty intuitive and easy to grasp.
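The byte-code step described above can be observed directly with the standard library; this sketch uses the built-in compile() function and the dis module:

```python
import dis

# compile() performs the automatic source-to-byte-code step explicitly.
code = compile("x = 2 + 3", "<example>", "exec")
print(type(code.co_code))    # the byte code itself is a bytes object
dis.dis(code)                # human-readable listing of the instructions
```

The interpreter normally does this compilation behind the scenes (caching the result in .pyc files for modules); the snippet just makes the intermediate form visible.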
7. Interpreted Language
Python is processed at runtime by the Python interpreter.
8. Interactive Programming Language
Users can interact with the Python interpreter directly to write their programs.
Artificial Intelligence
Introduction:
Artificial Intelligence is concerned with the design of intelligence in an artificial
device. The term was coined by John McCarthy in 1956.
Intelligence is the ability to acquire, understand and apply the knowledge to
achieve goals in the world.
AI is the study of the mental faculties through the use of computational models.
AI is the study of intellectual/mental processes as computational processes.
AI program will demonstrate a high level of intelligence to a degree that equals or
exceeds the intelligence required of a human in performing some task.
AI is unique, sharing borders with Mathematics, Computer Science, Philosophy,
Psychology, Biology, Cognitive Science and many others.
Although there is no clear definition of AI or even of intelligence, it can be described as an attempt to build machines that, like humans, can think and act, and are able to learn and use knowledge to solve problems on their own.
History of AI:
Important research that laid the groundwork for AI:
In 1931, Gödel laid the foundations of theoretical computer science. He published the first universal formal language and showed that mathematics itself is either flawed or allows for unprovable but true statements.
In 1936, Turing reformulated Gödel's result and Church's extension thereof.
In 1956, John McCarthy coined the term "Artificial Intelligence" as the topic of the
Dartmouth Conference, the first conference devoted to the subject.
In 1957, the General Problem Solver (GPS) was demonstrated by Newell, Shaw & Simon.
In 1958, John McCarthy (MIT) invented the Lisp language.
In 1959, Arthur Samuel (IBM) wrote the first game-playing program, for checkers,
to achieve sufficient skill to challenge a world champion.
In 1963, Ivan Sutherland's MIT dissertation on Sketchpad introduced the idea of
interactive graphics into computing.
In 1966, Ross Quillian (PhD dissertation, Carnegie Inst. of Technology; now CMU)
demonstrated semantic nets
In 1967, the Dendral program (Edward Feigenbaum, Joshua Lederberg, Bruce Buchanan, Georgia Sutherland at Stanford) was demonstrated interpreting mass spectra of organic chemical compounds. It was the first successful knowledge-based program for scientific reasoning.
In 1967, Doug Engelbart invented the mouse at SRI
In 1968, Marvin Minsky & Seymour Papert published Perceptrons, demonstrating limits of simple neural nets.
In 1972, Prolog developed by Alain Colmerauer.
In the mid-80s, neural networks became widely used with the Backpropagation algorithm (first described by Werbos in 1974).
In the 1990s, major advances occurred in all areas of AI, with significant demonstrations in machine learning, intelligent tutoring, case-based reasoning, multi-agent planning, scheduling, uncertain reasoning, data mining, natural language understanding and translation, vision, virtual reality, games, and other topics.
In 1997, Deep Blue beat the World Chess Champion, Kasparov.
In 2002, iRobot, founded by researchers at the MIT Artificial Intelligence Lab, introduced Roomba, a vacuum cleaning robot. By 2006, two million had been sold.
Foundations of Artificial Intelligence:
Philosophy
e.g., foundational issues (can a machine think?), issues of knowledge and belief, mutual knowledge
Psychology and Cognitive Science
e.g., problem solving skills
Neuroscience
e.g., brain architecture
Computer Science And Engineering
e.g., complexity theory, algorithms, logic and inference, programming languages,
and system building.
Mathematics and Physics
e.g., statistical modeling, continuous mathematics,
Statistical Physics, and Complex Systems.
Sub Areas of AI:
1) Game Playing: The Deep Blue chess program beat world champion Garry Kasparov.
2) Speech Recognition: PEGASUS, a spoken language interface to American Airlines' EAASY SABRE reservation system, allows users to obtain flight information and make reservations over the telephone. The 1990s saw significant advances in speech recognition, so that limited systems are now successful.
3) Computer Vision: Face recognition programs are in use by banks, government, etc. The ALVINN system from CMU autonomously drove a van from Washington, D.C. to San Diego (all but 52 of 2,849 miles), averaging 63 mph day and night, in all weather conditions. Other applications include handwriting recognition, electronics and manufacturing inspection, photo interpretation, baggage inspection, and reverse engineering to automatically construct a 3D geometric model.
4) Expert Systems: Application-specific systems that rely on obtaining the knowledge of human experts in an area and programming that knowledge into a system.
Digital Marketing
Digital Marketing Concept:
Digital Marketing, also called online marketing, is the promotion of brands to connect
with potential customers using the internet and other forms of digital communication.
This includes not only email, social media, and web-based advertising, but also text
and multimedia messages as a marketing channel.
If a marketing campaign involves digital communication, it's digital marketing.
Components of digital marketing
A) Advertising
Online advertising involves bidding and buying relevant ad units on third-party sites,
such as display ads on blogs, forums, and other relevant websites. Types of ads
include images, text, pop-ups, banners, and video. Retargeting is an important
aspect of online advertising. Retargeting requires code that adds an anonymous
browser cookie to track new visitors to your site. Then, as that visitor goes to other
sites, you can serve them ads for your product or service. This focuses your
advertising efforts on people who have already shown interest in your company.
B) Content marketing
Content marketing is an important strategy for attracting potential customers.
Publishing a regular cadence of high-quality, relevant content online will help
establish thought leadership. It can educate target customers about the problems
your product can help them resolve, as well as boost SEO rankings. Content can
include blog posts, case studies, whitepapers, and other materials that provide value
to your target audience. These digital content assets can then be used to acquire
customers through organic and paid efforts.
C) Email marketing
Email is a direct marketing method that involves sending promotional messages to a
segmented group of prospects or customers. Email marketing continues to be an
effective approach for sending personalized messages that target customers’ needs
and interests. It is most popular with e-commerce businesses as a way of staying top of mind for consumers.
D) Mobile marketing
Mobile marketing is the promotion of products or services specifically via mobile
phones and devices. This includes mobile advertising through text messages or
advertising in downloaded apps. However, a comprehensive mobile marketing
approach also includes optimizing websites, landing pages, emails, and content for
an optimal experience on mobile devices.
E) Paid search
Paid search increases search engine visibility by allowing companies to bid for
certain keywords and purchase advertising space in the search engine results. Ads
are only shown to users who are actively searching for the keywords you have
selected. There are two main types of paid search advertising — pay per click (PPC) and cost per mille (CPM). With PPC, you only pay when someone clicks on your ad. With CPM, you pay based on the number of impressions. Google Adwords is the
most widely used paid search advertising platform; however, other search engines
like Bing also have paid programs.
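The PPC/CPM difference is simple arithmetic; here is a sketch with invented campaign numbers:

```python
# Invented campaign numbers, purely to illustrate the two pricing models.
impressions = 50_000
clicks = 400
cpc_rate = 0.75   # PPC: price of one click
cpm_rate = 5.00   # CPM: price per 1,000 impressions

ppc_cost = clicks * cpc_rate
cpm_cost = (impressions / 1000) * cpm_rate

print(ppc_cost)   # 300.0 -- paid only when someone clicks
print(cpm_cost)   # 250.0 -- paid per impression, clicks or not
```

Which model is cheaper depends on the click-through rate: with few clicks per thousand impressions PPC favors the advertiser, while a high click-through rate can make CPM the better buy.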
F) Programmatic advertising
Programmatic advertising is an automated way of bidding for digital advertising.
Each time someone visits a web page, profile data is used to auction the ad
impression to competing advertisers. Programmatic advertising provides greater
control over what sites your advertisements are displayed on and who is seeing
them so you can better target your campaigns.
G) Reputation marketing
Reputation marketing focuses on gathering and promoting positive online reviews.
Reading online reviews can influence customer buying decisions and is an important
component of your overall brand and product reputation. An online reputation
marketing strategy encourages customers to leave positive reviews on sites where
potential customers search for reviews. Many of these review sites also offer native
advertising that allows companies to place ads on competitor profiles.
H) Search engine optimization
Search engine optimization (SEO) focuses on improving organic traffic to your
website. SEO activities encompass technical and creative tactics to improve
rankings and increase awareness in search engines. The most widely used search
engines include Google, Bing, and Yahoo. Digital marketing managers focus on
optimizing levers — such as keywords, cross links, back links, and original content
— to maintain a strong ranking.
I) Social media marketing
Social media marketing is a key component of digital marketing. Platforms such as
Facebook, Twitter, Pinterest, Instagram, Tumblr, LinkedIn, and even YouTube
provide digital marketing managers with paid opportunities to reach and interact with
potential customers. Digital marketing campaigns often combine organic efforts with
sponsored content and paid advertising promotions on key social media channels to
reach a larger audience and increase brand lift.
J) Video marketing
Video marketing enables companies to connect with customers in a more visually
engaging and interactive way. You can showcase product launches, events, and
special announcements, as well as provide educational content and testimonies.
YouTube and Vimeo are the most commonly used platforms for sharing and
advertising videos. Pre-roll ads (shown for the first 5–10 seconds before a video plays) are another way digital marketing managers can reach audiences on video platforms.
K) Web analytics
Analytics allow marketing managers to track online user activity. Capturing and
analyzing this data is foundational to digital marketing because it gives companies
insights into online customer behavior and their preferences. The most widely used
tool for analyzing website traffic is Google Analytics; however, other tools include
Adobe Analytics, Coremetrics, Crazy Egg, and more.
Data Mining & Network Security
Introduction to Data Warehouse:
A data warehouse is a subject-oriented, integrated, time-variant and non-volatile
collection of data in support of management's decision making process.
Subject-Oriented: A data warehouse can be used to analyze a particular subject
area. For example, "sales" can be a particular subject.
Integrated: A data warehouse integrates data from multiple data sources. For
example, source A and source B may have different ways of identifying a product,
but in a data warehouse, there will be only a single way of identifying a product.
Time-Variant: Historical data is kept in a data warehouse. For example, one can
retrieve data from 3 months, 6 months, 12 months, or even older data from a data
warehouse. This contrasts with a transaction system, where often only the most
recent data is kept. For example, a transaction system may hold the most recent
address of a customer, where a data warehouse can hold all addresses associated
with a customer.
Non-volatile: Once data is in the data warehouse, it will not change. So, historical
data in a data warehouse should never be altered.
Data Warehouse Design Process:
A data warehouse can be built using a top-down approach, a bottom-up approach, or
a combination of both.
The top-down approach starts with the overall design and planning. It is useful in
cases where the technology is mature and well known, and where the business
problems that must be solved are clear and well understood.
The bottom-up approach starts with experiments and prototypes. This is useful in the
early stage of business modeling and technology development. It allows an
organization to move forward at considerably less expense and to evaluate the
benefits of the technology before making significant commitments.
In the combined approach, an organization can exploit the planned and strategic
nature of the top-down approach while retaining the rapid implementation and
opportunistic application of the bottom-up approach.
The warehouse design process consists of the following steps:
Choose a business process to model, for example, orders, invoices, shipments,
inventory, account administration, sales, or the general ledger. If the business
process is organizational and involves multiple complex object collections, a data
warehouse model should be followed. However, if the process is departmental and
focuses on the analysis of one kind of business process, a data mart model should
be chosen.
Choose the grain of the business process. The grain is the fundamental, atomic level
of data to be represented in the fact table for this process, for example, individual
transactions, individual daily snapshots, and so on.
Choose the dimensions that will apply to each fact table record. Typical dimensions
are time, item, customer, supplier, warehouse, transaction type, and status.
Choose the measures that will populate each fact table record. Typical measures are
numeric additive quantities like dollars sold and units sold.
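The four design steps above can be sketched as a single fact-table record (the field names and values are invented for illustration):

```python
# One fact-table record for a "sales" process at individual-transaction grain.
fact_sales = {
    # dimensions: the context of the fact
    "time": "2024-03-15",
    "item": "SKU-1042",
    "customer": "C-889",
    "warehouse": "WH-03",
    # measures: numeric additive quantities
    "dollars_sold": 249.90,
    "units_sold": 3,
}

# Because the measures are additive, they can be summed across any dimension.
records = [fact_sales, {**fact_sales, "units_sold": 2, "dollars_sold": 166.60}]
total_units = sum(r["units_sold"] for r in records)
print(total_units)  # 5
```

In a real warehouse the dimension fields would be keys into separate dimension tables, but the division of each record into dimensions and measures is the same.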
A Three Tier Data Warehouse Architecture:
Tier-1:
The bottom tier is a warehouse database server that is almost always a relational database system. Back-end tools and utilities are used to feed data into the bottom tier from operational databases or other external sources (such as customer profile information provided by external consultants). These tools and utilities perform data extraction, cleaning, and transformation (e.g., to merge similar data from different sources into a unified format), as well as load and refresh functions to update the data warehouse. The data are extracted using application program interfaces known as gateways. A gateway is supported by the underlying DBMS and allows client programs to generate SQL code to be executed at a server. Examples of gateways include ODBC (Open Database Connection) and OLEDB (Open Linking and Embedding for Databases) by Microsoft, and JDBC (Java Database Connection). This tier also contains a metadata repository, which stores information about the data warehouse and its contents.
Tier-2:
The middle tier is an OLAP server that is typically implemented using either a relational OLAP (ROLAP) model or a multidimensional OLAP (MOLAP) model. A ROLAP model is an extended relational DBMS that maps operations on multidimensional data to standard relational operations. A MOLAP model is a special-purpose server that directly implements multidimensional data and operations.
Tier-3:
The top tier is a front-end client layer, which contains query and reporting tools,
analysis tools, and/or data mining tools (e.g., trend analysis, prediction, and so on).
Data Warehouse Models:
There are three data warehouse models.
1. Enterprise warehouse:
An enterprise warehouse collects all of the information about subjects spanning the
entire organization.
It provides corporate-wide data integration, usually from one or more operational
systems or external information providers, and is cross-functional in scope.
It typically contains detailed data as well as summarized data, and can range in size
from a few gigabytes to hundreds of gigabytes, terabytes, or beyond.
An enterprise data warehouse may be implemented on traditional mainframes,
computer super servers, or parallel architecture platforms. It requires extensive
business modeling and may take years to design and build.
2. Data mart:
A data mart contains a subset of corporate-wide data that is of value to a specific group of users. The
scope is confined to specific selected subjects. For example, a marketing data mart may confine its
subjects to customer, item, and sales. The data contained in data marts tend to be summarized.
Data Communication & Computer Network
1.1 Data Communication:
When we communicate, we are sharing information. This sharing can be
local or remote. Between individuals, local communication usually occurs
face to face, while remote communication takes place over distance.
1.1.1 Components: A data communications system has five
components.
1. Message. The message is the information (data) to be communicated.
Popular forms of information include text, numbers, pictures, audio, and
video.
2. Sender. The sender is the device that sends the data message. It can
be a computer, workstation, telephone handset, video camera, and so
on.
3. Receiver. The receiver is the device that receives the message. It can
be a computer, workstation, telephone handset, television, and so on.
4. Transmission medium. The transmission medium is the physical path
by which a message travels from sender to receiver. Some examples of
transmission media include twisted-pair wire, coaxial cable, fiber-optic
cable, and radio waves
5. Protocol. A protocol is a set of rules that govern data communications.
It represents an agreement between the communicating devices.
Without a protocol, two devices may be connected but not
communicating, just as a person speaking French cannot be understood
by a person who speaks only Japanese.
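A toy sketch of the five components (purely illustrative; a real protocol governs far more than this): the "protocol" here is a single agreed rule that both devices must apply.

```python
# 1. Message: the information to be communicated.
message = "HELLO"

# 5. Protocol: a single agreed rule -- here, send the text reversed.
def encode(text):       # rule applied by the sender
    return text[::-1]

def decode(text):       # matching rule applied by the receiver
    return text[::-1]

# 2. Sender applies the protocol before transmission.
sent = encode(message)

# 4. Transmission medium: here just a variable standing in for the wire.
medium = sent

# 3. Receiver applies the same protocol to recover the message.
received = decode(medium)
print(received)  # HELLO -- recovered only because both sides share the rule
```

A receiver that did not apply the agreed rule would see only "OLLEH": connected, but not communicating, exactly as in the French/Japanese example above.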
1.1.2 Data Representation: Information today comes in different forms
such as text, numbers, images, audio, and video.
Text:
In data communications, text is represented as a bit pattern, a
sequence of bits (0s or 1s). Different sets of bit patterns have been
designed to represent text symbols. Each set is called a code, and the
process of representing symbols is called coding. Today, the prevalent
coding system is called Unicode, which uses 32 bits to represent a
symbol or character used in any language in the world. The American
Standard Code for Information Interchange (ASCII), developed some
decades ago in the United States, now constitutes the first 128
characters in Unicode and is also referred to as Basic Latin.
Numbers:
Numbers are also represented by bit patterns. However, a code such as
ASCII is not used to represent numbers; the number is directly converted
to a binary number to simplify mathematical operations. Appendix B
discusses several different numbering systems.
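The text and number representations above can be checked directly in Python:

```python
# Text: each symbol has a code point; ASCII forms the first range of Unicode.
print(ord("A"))              # 65 -- the code for 'A'
print("A".encode("utf-8"))   # b'A' -- the byte pattern actually stored or sent

# Numbers: converted directly to binary, not sent as ASCII digit characters.
print(bin(65))               # 0b1000001 -- the bit pattern for the number 65
print(int("1000001", 2))     # 65 -- converted back from binary
```

Note the difference: the character "A" and the number 65 share the same bit pattern, but one is interpreted through a code (ASCII/Unicode) and the other as a binary number.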
Images:
Images are also represented by bit patterns. In its simplest form, an
image is composed of a matrix of pixels (picture elements), where each
pixel is a small dot. The size of the pixel depends on the resolution. For
example, an image can be divided into 1000 pixels or 10,000 pixels. In
the second case, there is a better representation of the image (better
resolution), but more memory is needed to store the image. After an
image is divided into pixels, each pixel is assigned a bit pattern. The size
and the value of the pattern depend on the image. For an image made of only black-and-white dots (e.g., a chessboard), a 1-bit pattern is enough to represent a pixel. If an image is not made of pure white and
pure black pixels, you can increase the size of the bit pattern to include
gray scale. For example, to show four levels of gray scale, you can use
2-bit patterns. A black pixel can be represented by 00, a dark gray pixel
by 01, a light gray pixel by 10, and a white pixel by 11. There are several
methods to represent color images. One method is called RGB, so
called because each color is made of a combination of three primary
colors: red, green, and blue. The intensity of each color is measured,
and a bit pattern is assigned to it. Another method is called YCM, in
which a color is made of a combination of three other primary colors:
yellow, cyan, and magenta.
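The four-level gray scale above maps directly to 2-bit patterns; a minimal sketch (the tiny 2x2 image is invented):

```python
# The four gray levels from the text, each assigned a 2-bit pattern.
GRAY_CODES = {"black": "00", "dark_gray": "01",
              "light_gray": "10", "white": "11"}

# An invented 2x2 image: a matrix of pixels encoded at 2 bits per pixel.
image = [["black", "white"],
         ["light_gray", "dark_gray"]]
bits = "".join(GRAY_CODES[p] for row in image for p in row)
print(bits)  # 00111001 -- 8 bits for 4 pixels
```

Doubling the bits per pixel quadruples the number of representable levels, which is why richer gray scales and color images need larger bit patterns and more memory.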
Audio: Audio refers to the recording or broadcasting of sound or music.
Audio is by nature different from text, numbers, or images. It is
continuous, not discrete. Even when we use a microphone to change
voice or music to an electric signal, we create a continuous signal. In
Chapters 4 and 5, we learn how to change sound or music to a digital or
an analog signal.
Video:
Video refers to the recording or broadcasting of a picture or movie. Video can
either be produced as a continuous entity (e.g., by a TV camera), or it can be a
combination of images, each a discrete entity, arranged to convey the idea of
motion. Again we can change video to a digital or an analog signal.