
IIITB

CS 816: Software Production Engineering

Major Project Report

TICKET MANAGEMENT SYSTEM

Under the guidance of,

• Prof. B. Thangaraju, IIITB


• Teaching Assistant: Sanandan Sharma, MT2021114

A Project by,
Fahed Shaikh, IMT2019079,
Rohit Oze, IMT2019072.
1 Overview
A ticket system application built with the MERN stack that allows teams to manage bugs, requests, and other
support tickets. It has the following features:

• Dashboard Home: We can view all the tickets raised by the users and their status.
• Raise a Ticket: We can raise tickets to the users regarding some work, and we can mention their priority.
• Manage Users: We can create/add or delete users.
• Manage Projects: We can create/add or delete projects.

2 Required Installations
1. Git: https://git-scm.com/book/en/v2/Getting-Started-Installing-Git
2. Java: https://www.oracle.com/in/java/technologies/downloads/#java11
3. IntelliJ IDE: https://www.jetbrains.com/idea/download/#section=mac
4. Jenkins: https://www.jenkins.io/doc/book/installing/
5. Maven: https://maven.apache.org/install.html
6. Docker: https://docs.docker.com/engine/install/
7. Ansible: https://docs.ansible.com/ansible/latest/installation_guide/index.html
8. NPM: brew install npm
9. Nodejs: brew install nodejs
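
Once the tools above are installed, the setup can be sanity-checked from a terminal:

• $ git --version
• $ java -version
• $ mvn -version
• $ docker --version
• $ ansible --version
• $ node -v && npm -v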

3 Creating Repositories
We also need to create accounts and repositories on the following platforms.

• GitHub: https://github.com/fahedworks-24x7/SPE-Major-Project
• DockerHub: https://hub.docker.com/repository/docker/fahed6739/ticket-system-master-frontend/general

4 DevOps Toolchain
• Git: Git is a source code management tool where developers collaborate on code and track the changes
committed in the codebase.
• JUnit: JUnit is a testing framework for Java developers to automate the testing process. The framework
can be used to write test code, perform unit tests, and generate reports on test coverage and test results.
• Maven: Maven is a build automation tool that helps manage dependencies and build Java-based projects.
It is used in many processes like compiling the source code, creating a binary file, and running unit tests.
• Jenkins: Jenkins is a continuous integration and continuous delivery (CI/CD) tool that continuously
integrates our code and automates the build, test, and deployment processes.
• Docker: Docker is a containerization platform used to create a container and push it to the Docker Hub.
It enables the developers to package applications and dependencies into portable, lightweight containers.
• Ansible: Ansible is a configuration management and deployment tool used to automate the configuration
and deployment of applications.
• ELK: Elasticsearch, Logstash, and Kibana are open-source tools used for centralized logging and log
analysis. We use the ELK Stack to monitor the application and generate alerts on errors and failures.
• log4j: log4j is a Java-based logging utility that provides a flexible logging infrastructure for our applications.
It can be used to log messages from applications, enabling developers to trace the flow of the code and
diagnose errors observed at runtime; a short usage sketch follows this list.
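
For illustration, this is roughly how log4j (the log4j2 API) is used from application code; the class and method
names below are hypothetical:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class TicketService { // hypothetical class for illustration
    // One logger per class is the usual log4j convention
    private static final Logger logger = LogManager.getLogger(TicketService.class);

    public void createTicket(String title) {
        logger.info("Creating ticket: {}", title); // parameterized message
        try {
            // ... ticket creation logic ...
        } catch (Exception e) {
            // Log the stack trace so runtime errors can be diagnosed later
            logger.error("Failed to create ticket: {}", title, e);
        }
    }
}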

5 DevOps
What is DevOps?
DevOps is an approach to software development that places a strong emphasis on communication and cooperation
between IT specialists and software developers, as well as automation and continuous software delivery. By
removing conventional boundaries between the various teams and processes involved in software development,
such as development, testing, deployment, and operations, DevOps seeks to increase the effectiveness and speed
of software development and delivery. This is accomplished by using tools and procedures like continuous inte-
gration, continuous delivery, and continuous deployment that automate many of the manual operations required
in software development and deployment. DevOps’ main objective is to help businesses provide high-quality
software more quickly and reliably while simultaneously increasing overall productivity and cutting expenses.

Why do we need DevOps?

• Faster time-to-market
• Improved collaboration and communication between teams
• Increased agility and flexibility to respond to changing business needs
• Improved software quality and reliability
• Reduced costs and increased efficiency through automation
• Ability to deliver software updates and new features quickly and frequently
• Improved customer satisfaction by delivering high-quality software faster
• Better alignment of business and IT goals.

6 Source Code Management


Source Code Management (SCM) is the process of managing and tracking changes to source code over time. It
is a crucial component of software development, as it enables developers to collaborate on code changes, keep
track of revisions, and maintain a record of all changes made to the codebase. SCM tools provide developers
with version control, branching, and merging capabilities, allowing them to work on different versions of code
simultaneously and merge changes when necessary. SCM also facilitates code review, testing, and deployment
processes, ensuring that changes are properly vetted and integrated into the final product.

In our project, we have made use of Git and GitHub.

• Creating a GitHub Repository: As soon as we enter the repositories section in our GitHub profile, we
can find a New option, and clicking on it opens the repository creation page. Scrolling down, we can find
the Create option to finalize creating our repository.
• Adding our project to the Repository: Using the following commands, we can upload our work from
the local system to the GitHub repository and push changes to it:
– $ git init
– $ git add .
– $ git commit -m "Any Text"
– $ git branch -M main
– $ git remote add origin < GitHub repo URL >
– $ git push -u origin main

• My GitHub Repository: This is how my GitHub repository looks after following the above steps.

7 Continuous Integration / Continuous Deployment


Jenkins is an open-source automation tool written in Java with plugins built for continuous integration. Jenkins
is utilized to continuously build and test software projects, simplifying the process for developers to integrate
changes, and allowing users to obtain up-to-date builds with ease. Additionally, Jenkins enables continuous
delivery of software by integrating with a wide range of testing and deployment technologies. We utilize Jenkins
Pipeline for our project. Jenkins Pipeline is a suite of plugins that allows developers to create and manage
continuous delivery pipelines as code. It enables developers to define the entire software delivery process in a
single pipeline script. After Jenkins is installed, browse to http://localhost:8080 (or whichever port you
configured for Jenkins during installation).
Jenkins can be installed by following these steps:

• $ wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
• $ sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
• $ sudo apt install ca-certificates
• $ sudo apt-get update

• $ sudo apt-get install jenkins

1. Jenkins Plugins are a key feature of Jenkins which allow developers to extend the functionality of the software
and customize it to meet their specific needs. Jenkins has a vast library of over 1,500 plugins, with new plugins
being developed and added to the library on a regular basis.
Browse to Manage Jenkins → Plugin Manager → Available Plugins and install the following plugins:

• Git plugin
• GitHub plugin
• Maven Integration plugin
• Docker plugin
• Docker Pipeline plugin
• Ansible plugin

2. Under the Global Tool Configuration options in Manage Jenkins, we have to specify the configuration
for Maven, Git and Ansible. Though they are almost always populated automatically, it is a good idea to
double-check.

3. Under the Manage Credentials options in Manage Jenkins, we have to specify the login credentials
for Docker Hub and GitHub.

7.1 Jenkins Pipeline
A pipeline in Jenkins is a mechanism to specify and automate the procedures necessary to develop, test, and
release software. A Jenkinsfile, a text file that lists the stages, actions, and configurations necessary to carry
out a continuous delivery pipeline, is normally where a pipeline is defined. The Jenkins web interface and
the Jenkinsfile itself both offer configuration and management options for Jenkins pipelines, enabling version-
controlled and automated pipeline administration. The stages of our pipeline script are:

1. Git Pull
2. Maven Build
3. Create Docker Image
4. Publish Docker Image
5. Ansible Deploy

Pipeline Script:
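
The actual script appears as a screenshot in the original report; a minimal sketch of what it looks like is given
below. The Maven tool name, the 'dockerhub' credentials ID, and the image tag are assumptions; the repository
and image names are the ones from Sections 3 and 8.

pipeline {
    agent any
    tools {
        maven 'Maven' // name as configured under Global Tool Configuration (assumed)
    }
    stages {
        stage('Git Pull') {
            steps {
                git branch: 'main', url: 'https://github.com/fahedworks-24x7/SPE-Major-Project'
            }
        }
        stage('Maven Build') {
            steps {
                sh 'mvn clean install'
            }
        }
        stage('Create Docker Image') {
            steps {
                sh 'docker build -t fahed6739/ticket-system-master-frontend:latest .'
            }
        }
        stage('Publish Docker Image') {
            steps {
                // 'dockerhub' is an assumed credentials ID set under Manage Credentials
                withCredentials([usernamePassword(credentialsId: 'dockerhub',
                        usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
                    sh 'echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin'
                    sh 'docker push fahed6739/ticket-system-master-frontend:latest'
                }
            }
        }
        stage('Ansible Deploy') {
            steps {
                // Step provided by the Ansible plugin installed earlier
                ansiblePlaybook inventory: 'inventory', playbook: 'deploy-image.yml'
            }
        }
    }
}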

The steps for creating this pipeline are:

Step 1: Pull the project from GitHub. After clicking on Generate Pipeline Script, we get the script to add to
the first stage of the pipeline.

Step 2: The next stage is Maven Build, for which the command mvn clean install has to be added to the script.

Step 3: After this, the Docker image has to be built. The main role of this stage is to build the image using
the Dockerfile.

Step 4: The built image is pushed to Docker Hub. Docker Hub repositories allow you to share container images
with your team, customers, or the Docker community at large. Docker images are pushed to Docker Hub through
the docker push command. A single Docker Hub repository can hold many Docker images.
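
Concretely, this stage boils down to commands like the following (the :latest tag is an assumption; the image
name is our Docker Hub repository from Section 3):

• $ docker tag ticket-system-master-frontend fahed6739/ticket-system-master-frontend:latest
• $ docker push fahed6739/ticket-system-master-frontend:latest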

Step 5: The Docker image is pulled to the managed node from Docker Hub using the deploy-image.yml file.
Next comes deployment using Ansible. For this, install Ansible and add its path to Jenkins so that Jenkins
can access it; check the path using the which command. Then build the job to deploy the application. If
everything is successful, you are done.

7.2 Jenkins Run


The pipeline script specifies all the steps and the build is triggered based on webhooks. The images below depict
my Jenkins run and the build history respectively.

8 Docker

8.1 Dockerfile


A text file called a Dockerfile contains a collection of instructions for creating a Docker image. The base image
to use, any additional packages or applications that must be installed, configuration settings, and any other
steps that must be taken to build the image are normally all described in a Dockerfile. Developers may automate
the process of creating Docker images by using a Dockerfile, which makes it simpler to maintain consistency
across many environments and platforms. It is simpler to work together on developing and delivering apps in a
Dockerized environment when Dockerfiles are version-controlled and shared among team members. Using the
docker build command, a Dockerfile may be used to construct a Docker image. The resulting image can then
be used to run containers that include the specified software and configurations.
Here is an image of the Dockerfile we have for our frontend.

And for the backend:

1. FROM: Specifies the base image that the new image will be built upon.
2. COPY: Copies files or directories from the build context into the image.
3. WORKDIR: Sets the working directory for any RUN, CMD, ENTRYPOINT, COPY, and ADD instructions
that follow.
4. CMD: Specifies the command to be executed when the container starts.

Here is what we are doing:
- We use an alpine image instead of a full node image, which uses less memory and gives better performance,
security, and maintainability.
- First we set the working directory to app and copy all the required files (relative to the root location).
- We copy package.json and package-lock.json first, since they contain all the dependencies.
- Then we run npm install, which installs all the dependencies.
- Finally, we copy all the files from the root folder of our directory and start the production server.
Then we build and tag the image and push it to Docker Hub; this process is handled in the Jenkins pipeline via
the Docker plugin.
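
Since the Dockerfile itself is only shown as a screenshot, here is a minimal sketch matching the description
above (the node:16-alpine tag and the npm start command are assumptions):

# Lightweight alpine variant instead of the full node image (tag assumed)
FROM node:16-alpine

# Set /app as the working directory inside the container
WORKDIR /app

# Copy the dependency manifests first so the npm install layer can be cached
COPY package.json package-lock.json ./

# Install all the dependencies listed in package.json
RUN npm install

# Copy the rest of the source tree from the root of the repository
COPY . .

# Start the production server when the container starts (assumed script)
CMD ["npm", "start"]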

8.2 DockerHub
We push the produced image to Docker Hub. Developers may store and share their Docker container images
with other developers or users via Docker Hub, a cloud-based repository. In essence, it serves as a single
repository for Docker images, allowing programmers to browse, download, and share pre-built Docker images
that may be used to run programs in containers.

9 Ansible
Ansible is a free and open source automation tool for managing and configuring computer systems. Users may
automate repetitive processes like software installations, configuration management, and application deployment
using its straightforward, agentless architecture. In our project, Ansible downloads the Docker image from
Docker Hub and deploys it on the managed machines.
First, to install Ansible, open the terminal and run the following commands:

1. sudo apt install openssh-server
2. sudo service ssh start
3. ssh-keygen -t rsa
4. ssh-copy-id <username>@<IP>
5. sudo apt update
6. sudo apt install ansible
7. ansible --version

1. An inventory file is a configuration file that lists the hosts and groups of hosts that Ansible will manage.
It contains the hostnames or IP addresses that Ansible connects to and executes tasks on.
2. A playbook in Ansible is a file or collection of files that specify a series of actions to be carried out on one
or more hosts. To automate complex processes like configuration management, application deployment, and
orchestration, playbooks are written in YAML format. One or more "plays" (collections of operations carried
out on a set of hosts) make up a playbook. Each play lists the hosts it applies to, along with any variables,
tasks, and handlers needed to carry out the intended activities. Sketches of both files are shown below.
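
A minimal sketch of both files, with an assumed host IP, user, port mapping, and container name (the image
name and the deploy-image.yml filename come from the report):

inventory:

[webservers]
192.168.1.10 ansible_user=ubuntu

deploy-image.yml:

---
- name: Deploy the ticket system frontend
  hosts: webservers
  become: yes
  tasks:
    - name: Pull the image from Docker Hub
      community.docker.docker_image:
        name: fahed6739/ticket-system-master-frontend
        source: pull

    - name: Run the container (port and name assumed)
      community.docker.docker_container:
        name: ticket-system-frontend
        image: fahed6739/ticket-system-master-frontend
        state: started
        published_ports:
          - "3000:3000"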

10 Running The Project

1. Dashboard Home

2. Development Usage

3. Manage Projects

4. Manage Users

5. Submit Ticket

11 Continuous Monitoring
The ELK stack is an acronym used to describe a stack that comprises three popular projects: Elasticsearch,
Logstash, and Kibana. Often referred to as Elasticsearch, the ELK stack gives you the ability to aggregate
logs from all your systems and applications, analyze these logs, and create visualizations for application and
infrastructure monitoring, faster troubleshooting, security analytics, and more.

E=Elasticsearch
Elasticsearch is a distributed search and analytics engine built on Apache Lucene. Support for various
languages, high performance, and schema-free JSON documents makes Elasticsearch an ideal choice for various log
analytics and search use cases.

L=Logstash
Logstash is an open-source data ingestion tool that allows you to collect data from a variety of sources, transform
it, and send it to your desired destination. With pre-built filters and support for over 200 plugins, Logstash
allows users to easily ingest data regardless of the data source or type.

K=Kibana
Kibana is a data visualization and exploration tool for reviewing logs and events. Kibana offers easy-to-use,
interactive charts, pre-built aggregations and filters, and geospatial support, making it the preferred choice
for visualizing data stored in Elasticsearch.

The ELK Stack fulfills a need in the log analytics space. As more and more of your IT infrastructure moves to
public clouds, you need a log management and analytics solution to monitor this infrastructure as well as process
any server logs, application logs, and clickstreams. The ELK stack provides a simple yet robust log analysis
solution for your developers and DevOps engineers to gain valuable insights on failure diagnosis, application
performance, and infrastructure monitoring – at a fraction of the price.

If your system has less than 16GB RAM, you can work on the cloud instead of installing the ELK stack on your
system.

Link: https://cloud.elastic.co

Steps:

1. Log4j2.xml file (a sketch follows this list)
2. Logs screenshot
3. Create a new deployment webpage
4. Deployment created
5. Downloading Login/Signup logs from Okta Dashboard
6. Kibana opened
7. Upload the log file here
8. Viewing index pattern
9. Viewing visualization of the location of login/signup users
10. Viewing visualization of the number of logins/signups
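
For reference, a minimal log4j2.xml along the lines of step 1 (the log file path and the pattern layout are
assumptions):

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Write log events to a file that can later be uploaded to Kibana -->
    <File name="LogFile" fileName="logs/app.log">
      <PatternLayout pattern="%d{ISO8601} [%t] %-5level %logger{36} - %msg%n"/>
    </File>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="LogFile"/>
    </Root>
  </Loggers>
</Configuration>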
