Finance Fraud Detection
MACHINE LEARNING
ABSTRACT:
Financial fraud, such as money laundering, is a serious crime in which illegitimately
obtained funds are channelled into terrorism or other criminal activity. Such illegal
activities involve complex networks of trade and financial transactions, which makes it
difficult to detect the fraudulent entities and to discover the features of the fraud. Fortunately,
a trading/transaction network and the features of the entities in that network can be constructed
from the complex networks of trade and financial transactions. The trading/transaction network
reveals the interactions between entities, so anomaly detection on trading networks can
reveal the entities involved in fraudulent activity, while the features of entities describe
those entities, and anomaly detection on features can reflect the details of the fraud. Thus,
the network and the features provide complementary information for fraud detection, which has
the potential to improve detection performance. However, the majority of existing methods
focus on network information or feature information separately and therefore do not exploit
both. In this paper, we propose a novel fraud detection framework, CoDetect, which
can leverage both network information and feature information for financial fraud detection.
In addition, CoDetect can simultaneously detect financial fraud activities and the
feature patterns associated with those activities. Extensive experiments on both synthetic
data and real-world data demonstrate the efficiency and effectiveness of the proposed
framework in combating financial fraud, especially money laundering.
CHAPTER 1
INTRODUCTION
It is very difficult for people to differentiate real currency from fake currency. In
Bangladesh, fake notes are in circulation. Even bank workers often cannot tell the
difference between real and fake notes, because they check the notes with the naked eye
and their perception can sometimes be wrong. As a result, ordinary people can be cheated
at any time, and this also encourages further criminal offences. To address this
major problem, a system is developed that detects whether a currency note is fake or real.
Bangladeshi currency is used for the analysis. The system implements an image
processing pipeline containing all the necessary steps, such as image scanning and reading,
image pre-processing, segmentation and feature extraction, along with code written in the
MATLAB software. The system therefore acts as an identifier of fake and real
currencies. MATLAB is the computational tool of choice for research, development and
analysis. The image formats supported include BMP, HDF, JPEG, PCX, TIFF, XWD, PNG and so
on.
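The report's image processing is done in MATLAB, but the reading and pre-processing steps
listed above can be sketched in Java as well. The snippet below is only an illustrative sketch
of the first two stages (reading the scanned note and converting it to grayscale before
segmentation and feature extraction); the file names are placeholders and this is not the
project's actual MATLAB code.

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class NoteScanner {
    public static void main(String[] args) throws Exception {
        // Read the scanned note image (the path is a placeholder).
        BufferedImage note = ImageIO.read(new File("note.jpg"));

        // Convert to grayscale as a simple pre-processing step before
        // segmentation and feature extraction would be applied.
        BufferedImage gray = new BufferedImage(
                note.getWidth(), note.getHeight(), BufferedImage.TYPE_BYTE_GRAY);
        gray.getGraphics().drawImage(note, 0, 0, null);

        ImageIO.write(gray, "png", new File("note_gray.png"));
    }
}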
Background
The relevance of a project shows how useful the output of the project is, which is
assessed by evaluating or by using the deliverables that were submitted. In other words,
the relevance of a project is always related to some goal; for general research
projects, that goal is mostly an increase in our scientific knowledge, although it is sometimes
also related to more direct social or environmental benefits for society. In this proposed
system, the focus is on the detection of fake currency, which is spreading in the Indian
market, and the main goal is to use image processing techniques to recognize original
currency. The relevance of this project is similar to that of currency recognition systems using
neural networks, which identify and extract robust features from banknotes. Nowadays many
approaches to currency recognition exist, yet most people still recognize currency manually,
which sometimes leads to mistakes; hence a system that can recognize all kinds of
currencies is useful to the common man.
Problem Statement
Existing systems use an optoelectronic device to produce a signal from the light
refracted by the banknote. Many currency recognition machines are available on the
current market through which currency can be recognized, whether by using image processing
techniques or neural networks. Existing currency recognition systems are mainly based on
image processing techniques and neural networks. Some systems
use a Gaussian function in the hidden and output layers of the neural network in place of the
sigmoid function; it has been shown that the Gaussian function is more effective than the
sigmoid function for recognizing known features and rejecting unknown patterns.
OBJECTIVES OF APPLICATION
Problem Description
The proposed web portal will help common people with currency recognition anywhere,
anytime. An automatic method for detecting fake currency notes is very important in every
country. In this approach the system extracts the general attributes of the paper currency, i.e.
the various dominant parts of the image of the currency note (identification marks, latent image,
etc.). The identification marks help to determine the denomination of the currency, and these
marks help to detect whether the note is fake or genuine. The system is developed to check
currency notes of 100, 500 and 1000 rupees. The web application displays the currency
denomination and whether the currency is genuine or fake. The system simply extracts features
of the currency, matches them against the features of original currency, and immediately
displays the result. The modules checked are described below:
Latent image
Currency Value Area (100,500 & old 1000)
Intaglio printing
Identification mark
See through register
Reserve Bank of India at top of currency note
Reserve Bank of India logo at right corner
SYSTEM ANALYSIS
Existing System
In recent years, financial fraud activities such as credit card fraud and money laundering
have increased gradually. These activities cause the loss of personal and/or enterprise property.
Even worse, they endanger national security because the profit from fraud may go to
terrorism. Thus, accurately detecting financial fraud and tracing it are necessary and
urgent. However, financial fraud detection is not an easy task due to the complex trading
networks and transactions involved. Taking money laundering as an example, money
laundering is defined as the process of using trades to move money/goods with the intent of
obscuring the true origin of funds. Usually, the prices, quantity or quality of goods on an
invoice used for money laundering are deliberately falsified. The misrepresentation of prices,
quantity or quality of goods on an invoice shows only a slight difference from the regular
baseline if we use the raw numbers as features to build a detection policy.
If you lose the offline wallet, the coins are lost forever.
It is still not yet accepted by a lot of businesses as a means of making payment.
Online coins can be hacked; once hacked, the coins are lost for good.
EXISTING ALGORITHM
Advantages
Proposed Algorithm
The first decentralized cryptocurrency, called Bitcoin, was created in 2009. In 2010 the price
of one bitcoin never rose above one dollar, but in May 2018 it had the highest
market cap, of around 149 billion dollars, with a little more than 17 million bitcoins in
circulation. This makes one bitcoin worth around 8700 dollars (market cap / circulation).
Bitcoin is a digital currency which can be used to buy or trade things electronically; however,
what makes it different from normal currency is the decentralization of the system by using
blockchain technology. Being decentralized, it has no single entity controlling the
network; it is instead maintained by a group of volunteers running it on different computers
around the world. A physical currency such as the USD can have an unlimited supply
if the government decides to print more, which can change the value of the currency relative to
others. However, the increase of Bitcoin in circulation is heavily guarded by an algorithm
which allows only a few bitcoins to be created every hour by miners, until a total cap
of 21 million bitcoins is reached.
The decentralization of the system in theory allows for anonymity, since there is no bank or
centralized entity to confirm and validate the transactions that are happening. In practice,
however, when a transaction is made, each user uses an identifier known as an address
during the transaction. These addresses are not associated with a name, but because
transactions must be transparent for the decentralized system to work, it is
possible for everyone to see them. A disadvantage of the system is that, in case of losing the
address, forgetting it, or wanting to reverse a transaction, nothing can be done about it,
because there is no authority involved.
Web sites – To a degree, any website that has information related to Bitcoin could be listed
here, but the most important types of site are the major media outlets and the government
sites of different countries, for any change in regulations.
Internet forums – Bitcointalk.org was created by one of the founders of Bitcoin and is one of
the biggest forums on Bitcoin; however, many smaller ones exist as well, which dedicate
themselves to news and information on Bitcoin. These sites are mostly guaranteed to have
users who are interested in and understand Bitcoin, which means most posts will be made by
users educated on the topic and will be related to Bitcoin.
Price collection – To obtain prices on Bitcoin, sites such as Bitcoin exchanges exist where it is
possible to buy or sell bitcoins. Since every site has a different price for Bitcoin, it is
not possible to guarantee that the biggest site is the one with the cheapest price. According to
one article, the three most recommended exchanges are Coinbase, for being the biggest, Gemini
Exchange, for its low fees, and Chantelle, since it lists lesser-known cryptocurrencies.
These social platforms and other data sources will be examined and analyzed to see which
would fit best for gathering data to use for predicting Bitcoin prices. Although each platform
has its own demographic, many of them overlap; this could be a problem if multiple
platforms were used, giving duplicate data rather than more data. Facebook has the potential
of giving the most data; however, since its main demographic probably does not know or use
Bitcoin, it might mostly yield data that does not influence the Bitcoin price. Twitter and Reddit,
on the other hand, might have less data to obtain, but the more specialized knowledge of their
users might relate to changes in the Bitcoin price. Telegram is the extreme case, with the
fewest users but the highest expertise.
CHAPTER 2
LITERATURE SURVEY
Business Understanding
Bitcoin, the first ever decentralized cryptocurrency, was originally conceived to create a
global currency and payment system that is able to work reliably without requiring a trusted
third party. The main underlying technology that powers Bitcoin, and almost all other
cryptocurrencies today, is known as blockchain (Lewis, 2015). The blockchain requires a
peer-to-peer network. ADEPT is one such peer-to-peer network, developed
by IBM in partnership with Samsung (Panikkar, Nair, Brody, & Pureswaran, 2015).
The emerging use of cryptocurrencies for payment is accelerating, and the technology is
having a positive impact on human life (Bakar & Rosbi, 2017). Individuals and businesses have
adopted this new system to transfer money quickly and efficiently over the internet
without supplying credit card or banking information or using a traditional payment
system (Ahamad, Nair, & Varghese, 2013). The cryptocurrency market had a total market
cap of around $800 billion USD in January 2018, as reported in the global chart of total market
capitalization.
Cryptocurrency is now seen as an investment tool, and not only institutional investors but also
private investors and brokers are interested in these digital currencies. In this regard, it is
necessary to predict the future value (i.e. forecast or estimate) of cryptocurrency prices
in order to make trading or investment decisions. The objective of Level-I was to construct a
forecasting model to predict the future values of cryptocurrencies. The case is financial time-
series prediction and integrates knowledge from various sources – cryptocurrencies,
quantitative finance and machine learning. The objective of Level-II was to implement the
model, which involves applying the "live" model to make decisions for trading or
investment. To forecast the values of 20 major cryptocurrencies, the methodology
shown in Figure 1 was employed.
Title: Stock Trend Prediction Using Simple Moving Average Supported by News
Classification
Description:
The ability to predict stock trends is crucial for stock investors. Using daily time-series data,
one can predict the trend with the help of the simple moving average technique.
Unfortunately, the stock trend is also affected by many factors, one of which is daily news.
Daily news, particularly financial news, plays a great role in deciding the stock trend. Each
news item has a sentiment value, classified as positive, negative or neutral, that directly affects
whether the trend goes up or down. It is therefore useful to combine the simple moving average
and news classification to predict the stock trend more responsively. The approach uses machine
learning, specifically an artificial neural network, to combine the two aspects. The experiment
uses approximately one year's worth of stock data and financial news. The artificial neural
network is able to combine the simple moving average technique and news classification, and
the results indicate that financial news can improve the prediction responsiveness. Stock
investment has been gradually growing in popularity lately. The increase in the number of
stock investors is propelled by the numerous stock prediction techniques formulated today.
Most of these techniques use time-series data. The utilization of these techniques has mainly
one objective, which is to maximize profit and minimize loss.
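As a rough illustration of the moving-average part of this approach (not the paper's actual
implementation), the short Java sketch below computes a simple moving average over the
most recent closing prices; the window size and sample prices are made up for the example.

import java.util.Arrays;
import java.util.List;

public class MovingAverage {

    // Simple moving average of the last `window` closing prices.
    static double sma(List<Double> closes, int window) {
        double sum = 0.0;
        for (int i = closes.size() - window; i < closes.size(); i++) {
            sum += closes.get(i);
        }
        return sum / window;
    }

    public static void main(String[] args) {
        List<Double> closes = Arrays.asList(101.0, 103.5, 102.0, 104.5, 106.0);
        // A price above its recent SMA is read as an up-trend signal, which the
        // paper then combines with the sentiment class of the daily news.
        System.out.println("3-day SMA = " + sma(closes, 3)); // (102.0 + 104.5 + 106.0) / 3
    }
}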
Drawback:
Advantages:
It uses machine learning, in the form of an artificial neural network, to combine the two
aspects.
The artificial neural network is able to combine the simple moving average technique and
news classification.
It can improve the prediction responsiveness.
Description:
Bitcoin is the latest technological form of digital virtual currency; it is based on
cryptography and is a decentralized currency used in the recent generation of the global
money system. The pseudo-anonymous exchange of cash over the web became common
and possible with the invention of these cryptocurrencies. Anonymity is an important factor
that keeps Bitcoin alive and interesting even after years. Bitcoins can be used
without linking any kind of real-world identity to them, which makes them different from
normal online currency. It is hard to tell who owns a bitcoin unless the owner's name is linked
to the Bitcoin address. Bitcoin does not monitor clients; it monitors addresses where the cash
is, and keeps a public ledger of all processed transactions. Conventional currencies, for example
the dollar and the euro, are controlled by governments and monetary factors which determine
their value and operation. This virtual cryptocurrency was introduced back in 2008,
and its software was released in 2009 by Satoshi Nakamoto. The basic idea of the system is
peer-to-peer, where transactions are made directly without a middle person. The transactions
are confirmed by network nodes and stored in a public ledger. This public distributed ledger
is called the blockchain and uses the bitcoin as its unit. The mining of bitcoins is an activity in
which coins are created as a reward for processing transactions; here the users offer
their computing power to confirm and record the payments.
Drawbacks:
Advantages:
Title: A First Estimation of the Proportion of Cybercriminal Entities in the Bitcoin
Ecosystem using Supervised Machine Learning.
Description:
Bitcoin is a peer-to-peer payment system and digital currency, conceived in 2008. In 2015,
Bitcoin was estimated to be accepted as a payment method by over 100,000 merchants
around the world. As a recent study showed, Bitcoin's popularity, together with that of other
digital currencies, continues to rise: there are 2.9 to 5.8 million unique users, the majority of
whom are Bitcoin users. Nonetheless, a significant slice of the Bitcoin ecosystem is, and has
been, associated with illicit activities such as money laundering, cyber-extortion, thievery,
scamming, terror financing and illegal goods trading through the darknet. Due to
Bitcoin's characteristics, especially its pseudo-anonymity, the fact that it became the preferred
payment system for illicit activities comes as no surprise. In contrast to other digital payment
methods such as debit or credit card payments, Bitcoin transactions are not linked to real-
world identities, but only to public keys or addresses, and the generation of the latter does not
require any verified personal information. Whereas some entities voluntarily reveal their
addresses when it is necessary to provide their services (e.g. donation addresses), others carry
out privacy-enhancing payment schemes or leverage mixing services to obscure their
spending habits. This behaviour is commonly seen in entities related to Tor markets,
ransom payments, scams and thievery.
Drawback:
Advantages:
Description:
Cryptocurrencies are causing a revolution around the world. In particular, Bitcoin is the
most successful one and can be exchanged for real currency. Bitcoin is a digital currency
system maintained by users in a P2P (peer-to-peer) network and does not need any central
authority such as a bank. Bitcoin is transferred between Bitcoin addresses via a formatted
message called a transaction. The legitimacy of transactions is verified by participating users,
and once verified, such transactions are recorded in the everlasting ledger called the blockchain.
Recently, more and more services accept Bitcoin as a payment method. For instance, a user
can buy items with Bitcoin through an advanced vending machine. Since it is quite
common to pay with mobile phones in daily life, more mobile Bitcoin client apps will be
available in the near future. However, at the end of January 2017 the size of Bitcoin's
blockchain was more than 100 GB, so it is not practical for storage-constrained devices to keep
the entire blockchain. To solve this problem, an SPV (Simplified Payment Verification) client
that does not download the entire blockchain has been developed.
Title: A Survey on Security and Privacy Issues of Bitcoin
Description:
SYSTEM SPECIFICATION
Hardware Requirements
Processor : Intel Core processor
Hard disk : 160 GB
RAM : 2 GB
Software Requirements
Operating System : Windows 7 / 8 / 10
Coding Language : JDK 1.8
Framework : Eclipse Mars
Web server : Apache Tomcat 8.0
Database : MySQL 5.0
SYSTEM STUDY
PRELIMINARY INVESTIGATION
The first and foremost strategy for the development of a project starts from the idea of
designing a mail-enabled platform for a small firm in which sending and receiving messages
is easy and convenient; it also includes a search engine, an address book and
some entertaining games. When this is approved by the organization and our project guide,
the first activity, i.e. the preliminary investigation, begins. The activity has three parts:
Request Clarification
Feasibility Study
Request Approval
REQUEST CLARIFICATION
After the approval of the request by the organization and the project guide, with an
investigation being considered, the project request must be examined to determine precisely
what the system requires. Here our project is basically meant for users within the company
whose systems can be interconnected by the Local Area Network (LAN). In today's busy
schedule, people need everything to be provided in a ready-made manner. Taking into
consideration the extensive use of the internet in day-to-day life, the corresponding
portal was developed.
FEASIBILITY ANALYSIS
Operational Feasibility
Economic Feasibility
Technical Feasibility
Operational Feasibility
Operational feasibility deals with the study of the prospects of the system to be
developed. This system operationally eliminates all the tensions of the admin and helps him
effectively track the project's progress. This kind of automation will surely reduce the
time and energy which were previously consumed by manual work. Based on the study, the
system is proved to be operationally feasible.
Economic Feasibility
Technical Feasibility
According to Roger S. Pressman, Technical Feasibility is the assessment of the
technical resources of the organization. The organization needs IBM compatible machines
with a graphical web browser connected to the Internet and Intranet. The system is developed
for platform Independent environment. Java Server Pages, JavaScript, HTML, SQL server
and WebLogic Server are used to develop the system. The technical feasibility has been
carried out. The system is technically feasible for development and can be developed with the
existing facility.
REQUEST APPROVAL
Not all requested projects are desirable or feasible. Some organizations receive so many
project requests from client users that only a few of them can be pursued. However, those
projects that are both feasible and desirable should be put into the schedule. After a project
request is approved, its cost, priority, completion time and personnel requirements are
estimated and used to determine where to add it to the project list. Strictly speaking, only after
the above factors are approved can development work be launched.
INPUT DESIGN
Input design plays a vital role in the life cycle of software development and requires
very careful attention from developers. The goal of input design is to feed data to the
application as accurately as possible, so inputs are supposed to be designed effectively so that
the errors occurring while feeding data are minimized. According to software engineering
concepts, the input forms or screens are designed to provide validation control over the input
limit, range and other related checks.
This system has input screens in almost all the modules. Error messages are
developed to alert the user whenever he commits a mistake and to guide him in the right
way so that invalid entries are not made. This is covered in more depth under module design.
Input design is the process of converting the user-created input into a computer-based
format. The goal of the input design is to make data entry logical and free from errors.
Errors in the input are controlled by the input design. The application has been
developed in a user-friendly manner. The forms have been designed in such a way that during
processing the cursor is placed in the position where data must be entered. The user is also
provided with an option to select an appropriate input from various alternatives related to
the field in certain cases.
Validations are required for each piece of data entered. Whenever a user enters erroneous
data, an error message is displayed, and the user can move on to the subsequent pages after
completing all the entries on the current page.
Output Design
The Output from the computer is required to mainly create an efficient method of
communication within the company primarily among the project leader and his team
members, in other words, the administrator and the clients. The output of VPN is the system
which allows the project leader to manage his clients in terms of creating new clients and
assigning new projects to them, maintaining a record of the project validity and providing
folder level access to each client on the user side depending on the projects allotted to him.
After completion of a project, a new project may be assigned to the client. User
authentication procedures are maintained at the initial stages itself. A new user may be
created by the administrator himself or a user can himself register as a new user but the task
of assigning projects and validating a new user rests with the administrator only.
The application starts running when it is executed for the first time. The server has to be
started, and then Internet Explorer is used as the browser. The project runs on the local
area network, so the server machine serves as the administrator while the other connected
systems act as the clients. The developed system is highly user friendly and can be easily
understood by anyone using it, even for the first time.
JAVA OVERVIEW
Java is a high-level language that can be characterized by all of the following buzzwords.
Simple
Object Oriented
Distributed
Multithreaded
Dynamic
Architecture Neutral
Portable
High performance
Robust
Secure
In the Java programming language, all the source code is first written in plain text
files ending with the .java extension. Those source files are then compiled into .class files by
the Java compiler (javac). A class file does not contain code that is native to your processor;
it instead contains byte codes - the machine language of the Java Virtual Machine. The Java
launcher tool (java) then runs your application with an instance of the Java Virtual Machine.
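A minimal example of this compile-and-run cycle (the file and class names are just for
illustration):

// HelloWorld.java is compiled to HelloWorld.class by javac, then executed
// by the java launcher on an instance of the Java Virtual Machine:
//
//     javac HelloWorld.java   (produces the byte code file HelloWorld.class)
//     java HelloWorld         (runs the byte code on the JVM)
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello from the JVM");
    }
}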
JAVA PLATFORM:
The Java Virtual Machine is the base of the Java platform and is ported onto various
hardware-based platforms.
The API is a large collection of ready-made software components that provide many
useful capabilities, such as graphical user interface (GUI) widgets. It is grouped into libraries
of related classes and interfaces; these libraries are known as packages.
Development Tools:
The development tools provide everything you’ll need for compiling, running,
monitoring, debugging, and documenting your applications. As a new developer, the main
tools you’ll be using are the Java compiler (javac), the Java launcher (java), and the Java
documentation (javadoc).
The API provides the core functionality of the Java programming language. It offers a
wide array of useful classes ready for use in your own applications. It spans everything from
basic objects, to networking and security.
Deployment Technologies:
The JDK provides standard mechanisms such as Java Web Start and Java Plug-In, for
deploying your applications to end users.
The Swing and Java 2D toolkits make it possible to create sophisticated Graphical
User Interfaces (GUIs).
Drag-and-drop support:
Drag-and-drop is, as its name implies, a two-step operation: code must be written to facilitate
dragging and code to facilitate dropping. Sun provides two classes to help with this, namely
DragSource and DropTarget.
Swing defines an abstract LookAndFeel class that represents all the information central
to a look-and-feel implementation, such as its name, its description, whether it is a native
look-and-feel and, in particular, a hash table (known as the "defaults table") for storing
default values for various look-and-feel attributes, such as colors and fonts.
Each look-and-feel implementation defines a subclass of LookAndFeel (for example,
swing.plaf.motif.MotifLookAndFeel) to provide Swing with the necessary information to
manage the look-and-feel.
The UIManager is the API through which components and programs access look-and-
feel information (they should rarely, if ever, talk directly to a LookAndFeel instance).
UIManager is responsible for keeping track of which LookAndFeel classes are available,
which are installed, and which is currently the default. The UIManager also manages access
to the defaults table for the current look-and-feel.
When a Swing application programmatically sets the look-and-feel, the ideal place to
do so is before any Swing components are instantiated. This is because the
UIManager.setLookAndFeel() method makes a particular LookAndFeel the current default
by loading and initializing that LookAndFeel instance, but it does not automatically cause any
existing components to change their look-and-feel.
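A small sketch of that recommendation, setting the look-and-feel before any component is
created; the cross-platform look-and-feel is used here only as an example.

import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;
import javax.swing.UIManager;

public class LookAndFeelDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            try {
                // Set the look-and-feel BEFORE any Swing components are created,
                // as recommended above; components created earlier would not be
                // updated automatically.
                UIManager.setLookAndFeel(UIManager.getCrossPlatformLookAndFeelClassName());
            } catch (Exception e) {
                e.printStackTrace();
            }
            JFrame frame = new JFrame("Look and feel demo");
            frame.add(new JLabel("Hello"));
            frame.pack();
            frame.setVisible(true);
        });
    }
}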
One aim of the IDE is to reduce the configuration necessary to piece together
multiple development utilities, instead providing the same set of capabilities as a cohesive
unit. Reducing that setup time can increase developer productivity, in cases where
learning to use the IDE is faster than manually integrating all of the individual tools.
Tighter integration of all development tasks has the potential to improve overall
productivity beyond just helping with setup tasks.
IDE Tools
There are many IDE tools available that provide a source code editor, build automation tools
and a debugger. Some of these tools are:
Eclipse
NetBeans
Code::Blocks
Code Lite
Dialog Blocks
NetBeans IDE 8.0 is released, also providing new features for Java 8 technologies. It has
code analyzers and editors for working with Java SE 8, Java SE Embedded 8, and Java ME
Embedded 8. The IDE also has new enhancements that further improve its support for Maven
and Java EE with PrimeFaces.
1. Tools for Java 8 Technologies. Anyone interested in getting started with lambdas, method
references, streams, and profiles in Java 8 can do so immediately by downloading NetBeans
IDE 8. Java hints and code analyzers help you upgrade anonymous inner classes to lambdas,
right across all your code bases, all in one go. Java hints in the Java editor let you quickly and
intuitively switch from lambdas to method references, and back again (a small example
follows below).
Moreover, Java SE Embedded support entails that you’re able to deploy, run, debug or profile
Java SE applications on an embedded device, such as Raspberry PI, directly from NetBeans
IDE. No new project type is needed for this, you can simply use the standard Java SE project
type for this purpose.
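The sketch below shows the kind of rewrite those editor hints perform: the same sort call
written as an anonymous inner class, as a lambda, and as a method reference. The sample data
is made up for the example.

import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class LambdaHintsDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("charlie", "alice", "bob");

        // Anonymous inner class (pre-Java 8 style) ...
        names.sort(new Comparator<String>() {
            @Override
            public int compare(String a, String b) {
                return a.compareTo(b);
            }
        });

        // ... the equivalent lambda the IDE hint would suggest ...
        names.sort((a, b) -> a.compareTo(b));

        // ... and the method reference form.
        names.sort(String::compareTo);

        System.out.println(names);
    }
}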
2. Tools for Java EE Developers. The code generators for which NetBeans IDE is well
known have been beefed up significantly. Where before you could create bits and
pieces of code for various popular Java EE component libraries, you can now generate
complete PrimeFaces applications from scratch, including CRUD functionality and
database connections.
Additionally, the key specifications of the Java EE 7 Platform now have new and enhanced
tools, such as for working with JPA and CDI, as well as Facelets.
Let’s not forget to mention in this regard that Tomcat 8.0 and TomEE are now supported, too,
with a new plugin for WildFly in the NetBeans Plugin Manager.
3. Tools for Maven. A key strength of NetBeans IDE, and a reason why many developers
have started using it over the past years, is its out-of-the-box support for Maven. There is no
need to install a Maven plugin, since it is a standard part of the IDE, and no need to deal with
IDE-specific files, since the POM provides the project structure. And now, in NetBeans IDE
8.0, there are enhancements to the graph layout, enabling you to visualize your POM in
various ways, while also being able to graphically exclude dependencies from the POM file
without touching the XML.
4. Tools for JavaScript. Thanks to powerful new JavaScript libraries and frameworks over
the years, JavaScript as a whole has become a lot more attractive to many developers. For
some releases already, NetBeans IDE has been available as a pure frontend environment, that
is, minus all the Java tools for which it is best known. This lightweight IDE, including Git
versioning tools, provides a great environment for frontend developers. In particular, for users
of AngularJS, Knockout, and Backbone, the IDE comes with deep editor tools, such as code
completion and cross-artifact navigation. In NetBeans IDE 8.0, there is a very specific focus
on AngularJS, since this is such a dominant JavaScript solution at the moment. From AngularJS
controllers, you can navigate, via hyperlinks embedded in the JavaScript editor, to the related
HTML views. You can also use code completion inside the
HTML editor to access controllers, and even the properties within the controllers, to help you
accurately code the related artifacts in your AngularJS applications.
Also, remember that there’s no need to download the AngularJS Seed template, since it’s
built into the NetBeans New Project wizard.
5. Tools for HTML5. JavaScript is a central component of the HTML5 Platform, a collective
term for a range of tools and technologies used in frontend development. Popular supporting
technologies are Grunt, a build tool, and Karma, a test runner framework. Both of these are
now supported out of the box in NetBeans IDE 8.0
In an effort to set an independent database standard API for Java, Sun Microsystems
developed Java Database Connectivity, or JDBC. JDBC offers a generic SQL database access
mechanism that provides a consistent interface to a variety of RDBMSs. This consistent
interface is achieved through the use of “plug-in” database connectivity modules, or drivers.
If a database vendor wishes to have JDBC support, he or she must provide the driver for each
platform that the database and Java run on.
JDBC was announced in March of 1996. It was released for a 90 day public review
that ended June 8, 1996. Because of user input, the final JDBC v1.0 specification was
released soon after.
The remainder of this section will cover enough information about JDBC for you to
know what it is about and how to use it effectively. This is by no means a complete overview
of JDBC. That would fill an entire book.
JDBC Goals
Few software packages are designed without goals in mind, and JDBC is no exception: its
many goals drove the development of the API. These goals, in conjunction with early
reviewer feedback, have finalized the JDBC class library into a solid framework for building
database applications in Java.
The goals that were set for JDBC are important. They will give you some insight as to
why certain classes and functionalities behave the way they do. The design goals for
JDBC include the following:
2. SQL Conformance:
SQL syntax varies as you move from database vendor to database vendor. In an effort
to support a wide variety of vendors, JDBC will allow any query statement to be passed
through it to the underlying database driver. This allows the connectivity module to handle
non-standard functionality in a manner that is suitable for its users.
The JDBC SQL API must “sit” on top of other common SQL level APIs. This goal
allows JDBC to use existing ODBC level drivers by the use of a software interface. This
interface would translate JDBC calls to ODBC and vice versa.
4. Provide a Java interface that is consistent with the rest of the Java system
Because of Java’s acceptance in the user community thus far, the designers feel
that they should not stray from the current design of the core Java system.
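As a minimal sketch of what using the JDBC API looks like in this project's environment,
the snippet below connects to a MySQL database and runs a parameterized query. The JDBC
URL, the credentials and the "users" table are placeholders for illustration, not the project's
actual schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class JdbcDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder MySQL connection details; the driver is selected from the URL.
        String url = "jdbc:mysql://localhost:3306/frauddb";
        try (Connection con = DriverManager.getConnection(url, "root", "password");
             PreparedStatement ps = con.prepareStatement(
                     // "users" is a hypothetical table used only for illustration.
                     "SELECT id, name FROM users WHERE id = ?")) {
            ps.setInt(1, 1);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getInt("id") + " : " + rs.getString("name"));
                }
            }
        }
    }
}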
SQL Server 2008
History
The history of Microsoft SQL Server begins with the first Microsoft SQL Server
product - SQL Server 1.0, a 16-bit server for the OS/2 operating system in 1989 - and extends
to the current day. As of December 2016 the following versions are supported by Microsoft:
The current version is Microsoft SQL Server 2016, released June 1, 2016. The RTM
version is 13.0.1601.5. SQL Server 2016 is supported on x64 processors only.
SQL Process
When you are executing an SQL command for any RDBMS, the system determines
the best way to carry out your request and SQL engine figures out how to interpret the task.
There are various components included in the process. These components are Query
Dispatcher, Optimization Engines, Classic Query Engine and SQL Query Engine, etc. Classic
query engine handles all non-SQL queries but SQL query engine won't handle logical files.
Data storage
Data storage is a database, which is a collection of tables with typed columns. SQL
Server supports different data types, including primary types such as Integer, Float, Decimal,
Char (including character strings), Varchar (variable length character strings), binary (for
unstructured blobs of data), Text (for textual data) among others. The rounding of floats to
integers uses either Symmetric Arithmetic Rounding or Symmetric Round Down (fix)
depending on arguments: SELECT Round(2.5, 0) gives 3.
Microsoft SQL Server also allows user-defined composite types (UDTs) to be defined
and used. It also makes server statistics available as virtual tables and views (called Dynamic
Management Views or DMVs). In addition to tables, a database can also contain other objects
including views, stored procedures, indexes and constraints, along with a transaction log. A
SQL Server database can contain a maximum of 2^31 objects, and can span multiple OS-level
files with a maximum file size of 2^60 bytes (1 exabyte). The data in the database are stored in
primary data files with an .mdf extension. Secondary data files, identified by an .ndf
extension, are used to allow the data of a single database to be spread across more than one
file, and optionally across more than one file system. Log files are identified by the .ldf
extension.
Buffer management
SQL Server buffers pages in RAM to minimize disk I/O. Any 8 KB page can be
buffered in-memory, and the set of all pages currently buffered is called the buffer cache. The
amount of memory available to SQL Server decides how many pages will be cached in
memory. The buffer cache is managed by the Buffer Manager. Either reading from or writing
to any page copies it to the buffer cache. Subsequent reads or writes are redirected to the in-
memory copy, rather than the on-disc version. The page is updated on the disc by the Buffer
Manager only if the in-memory cache has not been referenced for some time. While writing
pages back to disc, asynchronous I/O is used whereby the I/O operation is done in a
background thread so that other operations do not have to wait for the I/O operation to
complete. Each page is written along with its checksum when it is written.
SQL Server allows multiple clients to use the same database concurrently. As such, it
needs to control concurrent access to shared data, to ensure data integrity when multiple
clients update the same data, or when clients attempt to read data that is in the process of being
changed by another client. SQL Server provides two modes of concurrency control:
pessimistic concurrency and optimistic concurrency. When pessimistic concurrency control is
being used, SQL Server controls concurrent access by using locks. Locks can be either shared
or exclusive. An exclusive lock grants the user exclusive access to the data; no other user can
access the data as long as the lock is held. Shared locks are used when data is being
read; multiple users can read from data locked with a shared lock, but cannot acquire an
exclusive lock. The latter would have to wait for all shared locks to be released.
SQLCMD
SQLCMD is a command line application that comes with Microsoft SQL Server, and
exposes the management features of SQL Server. It allows SQL queries to be written and
executed from the command prompt. It can also act as a scripting language to create and run a
set of SQL statements as a script. Such scripts are stored as a .sql file, and are used either for
management of databases or to create the database schema during the deployment of a
database.
SQLCMD was introduced with SQL Server 2005 and continues with SQL Server
2012 and 2014. Its predecessors for earlier versions were OSQL and ISQL, which are
functionally equivalent as far as TSQL execution is concerned, and many of the command-line
parameters are identical, although SQLCMD adds extra versatility.
The OLAP Services feature available in SQL Server version 7.0 is now called Microsoft
SQL Server Analysis Services. The term OLAP Services has been replaced with the term
Analysis Services. Analysis Services also includes a new data mining component. The
Repository component available in SQL Server version 7.0 is now called Microsoft SQL
Server Meta Data Services. References to the component now use the term Meta Data
Services. The term repository is used only in reference to the repository engine within Meta
Data Services.
The main database objects are:
1. TABLE
2. QUERY
3. FORM
4. REPORT
5. MACRO
1) TABLE:
a) Design View
b) Datasheet View
A)Design View
To build or modify the structure of a table, we work in the table design view. We can
specify what kind of data will be held.
B) Datasheet View
To add, edit or analyze the data itself, we work in the table's datasheet view mode.
2) QUERY:
A query is a question that has to be asked to get the required data. Access gathers
data that answers the question from one or more tables. The data that make up the answer is
either a dynaset (if you edit it) or a snapshot (which cannot be edited). Each time we run a
query, we get the latest information in the dynaset. Access either displays the dynaset or
snapshot for us to view or performs an action on it, such as deleting or updating.
3) FORMS:
A form is used to view and edit information in a database record. A form displays only
the information we want to see, in the way we want to see it. Forms use familiar controls
such as textboxes and checkboxes, which makes viewing and entering data easy. We can work
with forms in several views. Primarily there are two views. They are:
a) Design View
b) Form View
To build or modify the structure of a form, we work in the form's design view. We can add
controls to the form that are bound to fields in a table or query, including textboxes, option
buttons, graphs and pictures.
4) REPORT:
A report is used to view and print information from the database. The report can
group records into many levels and compute totals and averages by checking values from
many records at once. The report is also attractive and distinctive, because we have control
over its size and appearance.
5) MACRO:
A macro is a set of actions. Each action in a macro does something, such as opening a form
or printing a report. We write macros to automate common tasks, which makes work easier
and saves time.
SQL procedures:
Can contain SQL Procedural Language statements and features which support those
statements.
Are supported in the entire DB2 family of database products, in which many if not all of the
features are supported.
Are easy to implement, because they use a simple high-level, strongly typed language.
Allow you to return multiple result sets to the caller or to a client application.
Allow you to easily access the SQLSTATE and SQLCODE values as special variables.
Can be called from application programs written in other languages.
Support recursion.
Data Design
Schema Design
User table
Shared table
Request table
Message public
Message Name table
-- ----------------------------
-- Table structure for msgname
-- ----------------------------
DROP TABLE IF EXISTS `msgname`;
CREATE TABLE `msgname` (
`id` int(233) NOT NULL auto_increment,
`sid` int(233) default NULL,
`encptmsg` varchar(233) default NULL,
`ogmsg` varchar(233) default NULL,
`sentby` varchar(233) default NULL,
`sendto` varchar(233) default NULL,
`pubkey` varchar(233) default NULL,
`prikey` int(233) default NULL,
`reqid` int(233) default NULL,
`updt` int(233) default NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=latin1;
-- ----------------------------
-- Table structure for msgpub
-- ----------------------------
DROP TABLE IF EXISTS `msgpub`;
CREATE TABLE `msgpub` (
`id` int(122) NOT NULL auto_increment,
`sid` int(122) default NULL,
`encptmsg` varchar(122) default NULL,
`ogmsg` varchar(122) default NULL,
`sentby` varchar(122) default NULL,
`sendto` varchar(122) default NULL,
`pubkey` varchar(122) default NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=21 DEFAULT CHARSET=latin1;
-- ----------------------------
-- Table structure for request
-- ----------------------------
DROP TABLE IF EXISTS `request`;
CREATE TABLE `request` (
`sno` int(130) NOT NULL auto_increment,
`id` int(130) default NULL,
`sentby` varchar(120) default NULL,
`sendto` varchar(120) default NULL,
`type` varchar(130) default NULL,
`prikey` varchar(130) default NULL,
`reqid` int(130) default NULL,
PRIMARY KEY (`sno`)
) ENGINE=InnoDB AUTO_INCREMENT=11 DEFAULT CHARSET=latin1;
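As a small sketch of how the application layer might write into the request table defined
above, the snippet below inserts one row over JDBC. The connection URL, credentials and the
sample values are assumptions for illustration only, not taken from the project's source.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RequestDao {
    public static void main(String[] args) throws Exception {
        // Placeholder database name and credentials.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/frauddb", "root", "password");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO request (id, sentby, sendto, type, prikey, reqid) "
                             + "VALUES (?, ?, ?, ?, ?, ?)")) {
            ps.setInt(1, 1);
            ps.setString(2, "user1");        // sample sender
            ps.setString(3, "user2");        // sample receiver
            ps.setString(4, "key request");  // sample request type (assumed value)
            ps.setString(5, "PK-001");       // sample private key reference
            ps.setInt(6, 101);               // sample request id
            ps.executeUpdate();              // sno is auto_increment, so it is omitted
        }
    }
}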
Procedural Design
Logic Diagrams
SYSTEM ARCHITECTURE:
DATA FLOW DIAGRAM:
1. The DFD is also called a bubble chart. It is a simple graphical formalism that can be
used to represent a system in terms of the input data to the system, the various processing
carried out on this data, and the output data generated by the system.
2. The data flow diagram (DFD) is one of the most important modeling tools. It is used
to model the system components. These components are the system processes, the data
used by the processes, the external entities that interact with the system and the
information flows in the system.
3. The DFD shows how information moves through the system and how it is modified by
a series of transformations. It is a graphical technique that depicts information flow
and the transformations that are applied as data moves from input to output.
4. A DFD may be used to represent a system at any level of abstraction. A DFD may be
partitioned into levels that represent increasing information flow and functional detail.
[Data flow diagram: the user logs in, searches for a product and purchases it using a credit
card; the admin adds and maintains product details; each purchase passes through Spike
Detection and Communal Detection, and the purchase is authorized if the checks pass,
otherwise it is reported to the admin.]
UML DIAGRAMS
UML stands for Unified Modeling Language. UML is a standardized general-purpose
modeling language in the field of object-oriented software engineering. The standard is
managed, and was created, by the Object Management Group.
The goal is for UML to become a common language for creating models of object-
oriented computer software. In its current form UML comprises two major components:
a meta-model and a notation. In the future, some form of method or process may also be
added to, or associated with, UML.
The Unified Modeling Language is a standard language for specifying, visualizing,
constructing and documenting the artifacts of a software system, as well as for business
modeling and other non-software systems.
The UML represents a collection of best engineering practices that have proven
successful in the modeling of large and complex systems.
The UML is a very important part of developing object-oriented software and the
software development process. The UML uses mostly graphical notations to express the
design of software projects.
GOALS:
The Primary goals in the design of the UML are as follows:
1. Provide users a ready-to-use, expressive visual modeling Language so that they can
develop and exchange meaningful models.
2. Provide extendibility and specialization mechanisms to extend the core concepts.
3. Be independent of particular programming languages and development process.
4. Provide a formal basis for understanding the modeling language.
5. Encourage the growth of OO tools market.
6. Support higher level development concepts such as collaborations, frameworks,
patterns and components.
7. Integrate best practices.
USE CASE DIAGRAM:
A use case diagram in the Unified Modeling Language (UML) is a type of behavioral
diagram defined by and created from a Use-case analysis. Its purpose is to present a graphical
overview of the functionality provided by a system in terms of actors, their goals (represented
as use cases), and any dependencies between those use cases. The main purpose of a use case
diagram is to show what system functions are performed for which actor. Roles of the actors
in the system can be depicted.
[Use case diagram: the user and the system interact through use cases such as Search
Product, Communal Detection, Spike Detection and Transaction Successful.]
CLASS DIAGRAM:
In software engineering, a class diagram in the Unified Modeling Language (UML) is a type
of static structure diagram that describes the structure of a system by showing the system's
classes, their attributes, operations (or methods), and the relationships among the classes. It
explains which class contains information.
[Class diagram classes:
ADMIN: product no, name, price, validity; add(), modify(), delete(), view().
USER: name, credit card no, date of birth; purchase(), view details().
PRODUCT DETAILS: mobile, computer, laptops, shirts, cost; add(), modify(), delete().
DETECTION SYSTEM: card no, date of birth; spike detection(), communal detection().]
SEQUENCE DIAGRAM:
A sequence diagram in Unified Modeling Language (UML) is a kind of interaction diagram
that shows how processes operate with one another and in what order. It is a construct of a
Message Sequence Chart. Sequence diagrams are sometimes called event diagrams, event
scenarios, and timing diagrams.
[Sequence diagram between ADMIN, USER and SYSTEM: admin login, validate, login
successful, add product details; user login, validate, purchase details; communal detection,
spike detection.]
ACTIVITY DIAGRAM:
Activity diagrams are graphical representations of workflows of stepwise activities and
actions with support for choice, iteration and concurrency. In the Unified Modeling
Language, activity diagrams can be used to describe the business and operational step-by-step
workflows of components in a system. An activity diagram shows the overall flow of control.
[Activity diagram: the admin logs in and views, adds, modifies or deletes product details;
the user logs in and purchases products; each purchase passes through communal detection
and spike detection; authorized purchases proceed, while unauthorized ones are reported to
the admin.]
IMPLEMENTATION
Implementation is the stage of the project when the theoretical design is turned into
a working system. It can thus be considered the most critical stage in achieving a
successful new system and in giving the user confidence that the new system will work and
be effective.
MODULES:
2. Login
3. Security information
4. Transaction
5. Verification
MODULE DESCRIPTION
LOGIN
The Login Form module presents site visitors with a form containing username and password
fields. If the user enters a valid username/password combination, they are granted access
to additional resources on the website. Which additional resources they have access to can
be configured separately.
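A minimal servlet sketch of this login check is shown below. The hard-coded demo credentials
are a stand-in for the real database lookup, the redirect targets are only assumed from the
sample pages later in this report, and the class itself is not part of the project's source.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Illustrative sketch only, not the project's actual login code.
public class LoginServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String user = req.getParameter("username");
        String pwd = req.getParameter("password");

        // In the real module this check would query the user table in MySQL;
        // demo credentials are used here to keep the sketch self-contained.
        boolean valid = "demo".equals(user) && "demo123".equals(pwd);

        if (valid) {
            req.getSession().setAttribute("user", user);  // grant access
            resp.sendRedirect("UserMain.jsp");            // assumed landing page
        } else {
            resp.sendRedirect("UserLogin.jsp");           // back to the login form
        }
    }
}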
SECURITY INFORMATION
The Security Information module collects the user's security details and stores them in the
database. If access to the business process is lost, the security information form is presented:
it has a set of questions which the user has to answer correctly in order to move on to the
transaction section. Informational privacy and informational self-determination are addressed
by affording persons and entities a trusted means to store, secure, search, process and
exchange personal and/or confidential information.
TRANSACTION
VERIFICATION
MODULES:
Admin module
User module
Identify Crime User
Online Shopping
Admin module:
In this module, the admin can add product details (product name, price, validity etc..)
based on the category likes mobiles, computers, laptops etc.. And maintain the product
details. The user enter their credit card details, the credit card is valid by Communal
Detection and Spike Detection. If the card details is valid, the user can purchase their items
else it report to the admin as “fraud transaction”.
User module:
The user can select products displayed on the home page, or search for a
product using a keyword or by category. The user can then purchase the product using a
credit/debit card. To purchase, the user needs to provide details such as the credit card
number, card holder name, date of birth and credit card provider. If the credit card is valid,
the user is allowed to purchase the product; otherwise the transaction is reported to the admin
as a "fraud transaction".
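The report does not spell out the Spike Detection algorithm itself; as a rough illustration of
the idea only, the sketch below flags a card when too many purchase attempts arrive within a
short time window. The threshold and window length are assumptions made for the example.

import java.util.ArrayDeque;
import java.util.Deque;

// Rough illustration of the spike-detection idea, not the project's algorithm.
public class SpikeCheck {
    private static final int MAX_ATTEMPTS = 3;     // assumed threshold
    private static final long WINDOW_MS = 60_000;  // assumed 1-minute window

    private final Deque<Long> recentAttempts = new ArrayDeque<>();

    // Returns true when the current attempt looks like a spike.
    public synchronized boolean isSpike(long nowMillis) {
        // Drop attempts that fell out of the time window.
        while (!recentAttempts.isEmpty()
                && nowMillis - recentAttempts.peekFirst() > WINDOW_MS) {
            recentAttempts.pollFirst();
        }
        recentAttempts.addLast(nowMillis);
        return recentAttempts.size() > MAX_ATTEMPTS;
    }

    public static void main(String[] args) {
        SpikeCheck check = new SpikeCheck();
        long t = System.currentTimeMillis();
        for (int i = 0; i < 5; i++) {
            System.out.println("attempt " + (i + 1) + " spike? " + check.isSpike(t + i * 1000));
        }
        // The 4th and 5th attempts inside the same minute exceed the threshold
        // and would be reported to the admin as a possible "fraud transaction".
    }
}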
4. Online shopping
In this module, we developed a website for online shopping. The user can purchase
products using a credit card. If a fraudulent user uses a credit card to purchase an item, the
bank identifies the fraudulent user using the weight and graph of the user in the bank database.
CHAPTER – VII
SYSTEM TESTING
TYPES OF TESTING
UNIT TESTING
Unit testing involves the design of test cases that validate that the internal program
logic is functioning properly and that program input produces valid outputs. All decision
branches and internal code flow should be validated. It is the testing of individual software
units of the application and is done after the completion of an individual unit, before
integration. This is structural testing that relies on knowledge of the unit's construction and is
invasive. Unit tests perform basic tests at the component level and test a specific business
process, application, and/or system configuration. Unit tests ensure that each unique path of a
business process performs accurately to the documented specifications and contains clearly
defined inputs and expected results.
Unit testing is usually conducted as part of a combined code and unit test phase of the
software lifecycle, although it is not uncommon for coding and unit testing to be conducted as
two distinct phases.
Field testing will be performed manually and functional tests will be written in detail.
Test objectives:
INTEGRATION TESTING
The task of the integration test is to check that components or software applications,
e.g. components in a software system or – one step up – software applications at the company
level – interact without error.
FUNCTIONAL TESTING
Functional tests provide systematic demonstrations that functions tested are available
as specified by the business and technical requirements, system documentation and user
manuals.
Business process flows; data fields, predefined processes, and successive processes
must be considered for testing. Before functional testing is complete, additional tests are
identified and the effective value of current tests is determined.
SYSTEM TESTING
System testing ensures that the entire integrated software system meets requirements.
It tests a configuration to ensure known and predictable results. An example of system testing
is the configuration-oriented system integration test. System testing is based on process
descriptions and flows, emphasizing pre-driven process links and integration points.
ACCEPTANCE TESTING
User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional
requirements.
OTHER TESTING METHODOLOGIES
User Acceptance of a system is the key factor for the success of any system. The
system under consideration is tested for user acceptance by constantly keeping in touch with
the prospective system users at the time of developing and making changes wherever
required. The system developed provides a friendly user interface that can easily be
understood even by a person who is new to the system.
Output Testing
After performing the validation testing, the next step is output testing of the proposed
system, since no system could be useful if it does not produce the required output in the
specified format. Asking the users about the format required by them tests the outputs
generated or displayed by the system under consideration. Hence the output format is
considered in 2 ways – one is on screen and another in printed format.
Validation Checking
Text Field:
The text field can contain only a number of characters less than or equal to its
size. The text fields are alphanumeric in some tables and alphabetic in other tables. An
incorrect entry always flashes an error message.
Numeric Field:
The numeric field can contain only numbers from 0 to 9. An entry of any other character
flashes an error message. The individual modules are checked for accuracy and for what they
have to perform. Each module is subjected to a test run along with sample data. The
individually tested modules are integrated into a single system. Testing involves executing the
program with real data; the existence of any program defect is inferred from the output. The
testing should be planned so that all the requirements are individually tested.
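The field checks described above can be sketched as two small helper methods; the sample
values and field sizes used here are arbitrary.

public class FieldValidator {

    // A numeric field may contain only the digits 0 to 9.
    static boolean isNumeric(String value) {
        return value != null && !value.isEmpty() && value.matches("[0-9]+");
    }

    // A text field may contain only up to its declared size in characters.
    static boolean fitsField(String value, int size) {
        return value != null && value.length() <= size;
    }

    public static void main(String[] args) {
        System.out.println(isNumeric("4125"));         // true
        System.out.println(isNumeric("41a5"));         // false, an error message is flashed
        System.out.println(fitsField("Raymond", 20));  // true
    }
}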
UNIT TESTING
S.No: 3 | Procedure: Click login button | Test Condition: Authentication verification |
Expected Result: Pass / Fail verification against the database
INTEGRATION TESTING
S.No: 3 | Procedure: Add new product details | Test Condition: Enter valid details |
Test Data: New patient name Raymond, doctor name Men, report name Fever, Quantity 1,
Service charge 1500, Description shirt quality, Upload image name.jpeg |
Expected Result: Patient added successfully to the database
Sample Coding
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<style type="text/css">
<!--
.style1 {
font-size: 18px;
color: #FF0000;
font-weight: bold;
}
-->
</style>
</head>
<body>
<div id="wrapper">
<div id="menu">
<ul>
<li class="current_page_item"><span><a
href="index.html">Home</a></span></li>
<li><a href="CompanyLogin.jsp">Transport
Company</a></li>
<li><a href="UserLogin.jsp">User</a></li>
<li><a href="#"></a></li>
<li><a href="#"></a></li>
</ul>
</div>
<div id="header">
<div id="logo">
</div>
<div id="search">
<fieldset>
</fieldset>
</form>
</div>
</div>
<div id="splash"> </div>
<div id="page">
<div id="page-bgtop">
<div id="page-bgbtm">
<div id="content">
<div class="post">
<div class="entry">
</div>
</div>
<div class="post"></div>
<div id="sidebar">
<ul><li><h2>Menu</h2>
<ul>
<li><a
href="UserMain.jsp">Home</a></li>
<li><a href="UserProfile.jsp">View
My Profile</a></li>
<li><a href="U_ManageBankAccount.jsp">Manage
Bank Account </a></li>
<li><a
href="U_CreditCardRequest.jsp">Request Credit Card </a></li>
<li><a href="U_CreditCard.jsp">View
Credit Card Details </a></li>
<li><a href="U_ViewCardTransactions.jsp">View
Card Transactions </a></li>
<li><a href="U_ViewTransportCompany.jsp">View
all Transport Company and Book Tickets by Selecting Company</a></li>
</ul>
</li>
<li>
<h2> </h2>
</li>
</ul>
</div>
</div>
</div>
</div>
</div>
</body>
</html>
User Register.jsp
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<style type="text/css">
<!--
.style1 {
font-size: 18px;
color: #FF0000;
font-weight: bold;
}
-->
</style>
</head>
<body>
<div id="wrapper">
<div id="menu">
<ul>
<li class="current_page_item"><span><a
href="index.html">Home</a></span></li>
<li><a href="AdminLogin.jsp">Bank Admin</a></li>
<li><a href="UserLogin.jsp">User</a></li>
<li><a href="#"></a></li>
<li><a href="#"></a></li>
</ul>
</div>
<div id="header">
<div id="logo">
</div>
<div id="search">
<fieldset>
</fieldset>
</form>
</div>
</div>
<div id="splash"> </div>
<div id="page">
<div id="page-bgtop">
<div id="page-bgbtm">
<div id="content">
<div class="post">
<div class="entry">
<%
// Load the bank names for the drop-down list. The JDBC Connection
// "connection", the SQL string "query" and the list "a1" are assumed to be
// declared in the part of the page omitted from this listing.
try {
Statement st = connection.createStatement();
ResultSet rs = st.executeQuery(query);
while (rs.next()) {
a1.add(rs.getString("bank"));
}
%>
<tr>
</label></td>
<td><p align="left">
<select name="bank"> <!-- opening select tag restored; the name attribute is an assumption -->
<option>--Select--</option>
<%
for (int i = 0; i < a1.size(); i++) {
%>
<option><%= a1.get(i) %></option>
<%
}
%>
</select></p> </td>
</tr>
<tr>
</span>
<label for="label">
</label>
<span class="style35">
<label for="name"></label>
</span></td><td width="371"><input id="name" name="userid" class="text"
/></td>
</tr>
<tr>
</tr>
<tr>
<td height="47"> </td>
</tr>
<tr>
<td height="49"></td>
</tr>
</table>
<%
connection.close();
} catch (Exception e) {
out.println(e.getMessage());
}
%>
</form>
</div>
</div>
<div class="post"></div>
</div>
<ul><li><h2>Menu</h2>
<ul>
<li><a href="index.html">Home</a></li>
</ul>
</li>
<li>
<h2> </h2>
</li>
</ul>
</div>
</div>
</div>
</div>
</div>
</body>
</html>
Estimates: predicted duration and cost of the project should be backed up with the team’s
reasoning and circumstances for potential re-estimation.
Project plan: here, the plan states an approximate schedule, the project’s main stages, and
available resources.
Development phases: a project plan provides only a general description of the development
process. You can go into more detail when describing each phase individually. For every
phase, a team specifies its duration, objectives, and required resources.
Objectives: each phase and product iteration should be driven by clear goals. Make the list of
objectives for every stage of product development. The product owner and the development
team should keep these objectives realistic and clear to all project participants.
Release plans: the team can give an estimate of the expected release date and specify its
status (alpha, beta, demo, etc.).
Resourcing: this section describes available and unavailable skills, hardware, and software.
For each stage, there should be individual resourcing sections.
Conceptual Models
Comparison model highlighting conceptual model role in system process
A conceptual model's primary objective is to convey the fundamental principles and basic
functionality of the system which it represents. Also, a conceptual model must be developed
in such a way as to provide an easily understood system interpretation for the model's users.
A conceptual model, when implemented properly, should satisfy four fundamental objectives.
1. Enhance an individual's understanding of the representative system
2. Facilitate efficient conveyance of system details between stakeholders
3. Provide a point of reference for system designers to extract system specifications
4. Document the system for future reference and provide a means for collaboration
Fig: Conceptual Models
The conceptual model plays an important role in the overall system development life cycle.
The figure above depicts the role of the conceptual model in a typical system development
scheme. It is clear that if the conceptual model is not fully developed, the execution of
fundamental system properties may not be implemented properly, giving way to future
problems or system shortfalls. These failures do occur in the industry and have been linked to
lack of user input, incomplete or unclear requirements, and changing requirements. Those
weak links in the
system design and development process can be traced to improper execution of the
fundamental objectives of conceptual modeling. The importance of conceptual modeling is
evident when such systemic failures are mitigated by thorough system development and
adherence to proven development objectives/techniques.
Gantt Chart
Milestone 3 - GUI creation: The GUI files are created and worked on. It is important to
finalize the user interface at this stage itself so that development and testing can proceed
with the actual user interface.
Milestone 4 - High-level and Detailed Design: Listing down all possible scenarios and then
coming up with flowcharts or pseudo code to handle each scenario. The scenarios should map
to the requirement specification.
Milestone 8 - Final Review: Issues found during the previous milestone are fixed and the
system is ready for the final review. During the final review of the project, it should be
checked that all the requirements specified during milestone 1 are fulfilled.