Internship Report
Submitted by
ADITYA SINGH
230240170006
I declare that the work embodied in this internship report is my own original work, carried out by me under my own supervision during the session 28 Nov to 12 Dec at “Ybi Foundation”.
The matter embodied in this internship report has not been submitted elsewhere for the award of any other degree. I declare that I have faithfully acknowledged, given credit to, and referred to the researchers wherever their work has been cited in the text and body of this report. I further certify that I have not willfully lifted anyone else’s work (paragraphs, text, data, results, etc.) reported in journals, books, magazines, reports, dissertations, theses, etc., or available on websites, and presented it in this internship report as my own work.
Place: Roorkee
ACKNOWLEDGEMENT
I gratefully acknowledge the many people who lent their help to make the project “Data Science & AIML Internship” a success.
I take this opportunity to express my deep sense of gratitude to our honorable Director, “Dr.
Parag Jain”, for providing an academic climate in the college that made this endeavor possible.
I express my wholehearted admiration and deep gratitude to “Lokesh Kumar”, HOD,
“DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING”, Roorkee Institute of
Technology, Roorkee, for his inspiration, valuable guidance, encouragement, suggestions, and
overall help throughout.
I would like to express my sincere gratitude to my internship/project coordinator, “Mrs.
Pranita Singh”, Mentor, “DEPARTMENT OF COMPUTER SCIENCE &
ENGINEERING”, Roorkee Institute of Technology, Roorkee, for her kind support and
encouragement throughout this course of work.
Finally, I express my gratitude to all the Teaching and Non-Teaching staff of the
“DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING”, Roorkee Institute of
Technology, Roorkee, for their timely support and suggestions.
TABLE OF CONTENTS
ABSTRACT
BACKGROUND OF COMPANY/ORGANIZATION
PROGRAM AND OPPORTUNITIES
BENEFIT OF COMPANY/ORGANIZATION
1. INTRODUCTION
2. ANALYSIS
5. CODING
6. SCREENSHOTS
7. CONCLUSION
8. BIBLIOGRAPHY
2. ANALYSIS
The software requirement specification is produced at the culmination of the analysis task. The
function and performance allocated to software as part of system engineering are refined by
establishing a complete information description, a detailed functional description, a representation
of system behaviour, an indication of performance and design constraints, appropriate validation
criteria, and other information pertinent to requirements.
Software Requirements:
Hardware Requirements:
# AIML Technologies
1. Machine Learning Frameworks: TensorFlow, PyTorch, Keras, or Scikit-learn for
building and training machine learning models (a minimal sketch follows this list).
2. Deep Learning Frameworks: TensorFlow, PyTorch, or Keras for building
and training deep neural networks.
3. Natural Language Processing (NLP) Libraries: NLTK, spaCy, or Stanford
CoreNLP for text processing and analysis.
4. Computer Vision Libraries: OpenCV, Pillow, or scikit-image for image
processing and analysis.
5. Reinforcement Learning Libraries: Gym, Universe, or RLlib for training
reinforcement learning agents.
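As a brief illustration of the first item in the list above, the following sketch trains and evaluates a classifier with scikit-learn. The synthetic dataset, the random forest model, and all parameter values here are illustrative assumptions, not taken from the internship tasks.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Generate a small synthetic classification dataset (purely illustrative).
X, y = make_classification(n_samples=500, n_features=10, random_state=42)

# Hold out 20% of the data for evaluation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit a random forest and report accuracy on the held-out data.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))

The held-out test split keeps the accuracy estimate honest, since evaluating on the training data alone would overstate the model's performance.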
5. CODING
Throughout the 2-week internship program at Ybi Foundation, I had
the opportunity to work on various coding projects and tasks that
helped me develop my programming skills and apply theoretical
concepts to real-world problems. Here's an overview of the coding
experience:
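One representative task was a small exploratory data analysis script: it loads a CSV dataset, prints its shape, column names, and summary statistics, and then plots a correlation heatmap of its numeric columns.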
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

def load_dataset(file_path):
    # Load a CSV file into a DataFrame; return None if the file cannot be read.
    try:
        data = pd.read_csv(file_path)
        return data
    except Exception as e:
        print(f"Error loading dataset: {e}")
        return None

def analyze_dataset(data):
    # Print the basic structure and summary statistics of the dataset.
    print("Dataset Shape:", data.shape)
    print("Dataset Columns:", data.columns)
    print("Dataset Summary Statistics:")
    print(data.describe())

def visualize_dataset(data):
    # Plot a correlation heatmap; numeric_only=True restricts it to numeric columns.
    plt.figure(figsize=(10, 6))
    sns.heatmap(data.corr(numeric_only=True), annot=True, cmap="coolwarm")
    plt.title("Correlation Matrix")
    plt.show()

def main():
    file_path = "dataset.csv"
    data = load_dataset(file_path)
    if data is not None:
        analyze_dataset(data)
        visualize_dataset(data)

if __name__ == "__main__":
    main()
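Assuming a file named dataset.csv is present in the working directory, running the script prints the dataset's shape, column names, and summary statistics, and then displays the correlation heatmap. Returning None from load_dataset on failure lets main skip the analysis steps instead of crashing on a missing or malformed file.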
6. SCREENSHOTS
7. CONCLUSION
Key Takeaways
1. Practical Application of Theoretical Concepts: I applied
theoretical concepts learned in academia to real-world
problems, enhancing my understanding and retention of the
material.
2. Development of Transferable Skills: I developed essential
transferable skills, including problem-solving,
communication, teamwork, and time management, which
are valuable in various professional contexts.
3. Networking Opportunities: I established connections with
professionals in the industry, potentially opening doors for
future collaborations, mentorship, or career opportunities.
4. Improved Confidence and Self-Awareness: The internship
experience boosted my confidence and self-awareness,
allowing me to better understand my strengths,
weaknesses, and career aspirations.
Future Directions
This internship experience has solidified my interest in
pursuing a career in Data Science and AIML. I am eager to
continue developing my skills, exploring new opportunities,
and making meaningful contributions to the field.
Final Thoughts
I am grateful to Ybi Foundation for providing me with this
invaluable opportunity. The experience has been
enlightening, challenging, and rewarding, and I am
confident that it will have a lasting impact on my personal
and professional growth.
8. BIBLIOGRAPHY
(References used in writing this report)
# Books
1. Python Machine Learning by Sebastian Raschka (2015)
2. Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron
Courville (2016)
3. Python Data Science Handbook by Jake VanderPlas (2016)
# Research Papers
1. "ImageNet Classification with Deep Convolutional Neural
Networks" by Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton
(2012)
2. "Long Short-Term Memory" by Sepp Hochreiter and Jürgen
Schmidhuber (1997)
# Online Resources
1. Kaggle: A platform for data science competitions and hosting
datasets.
2. UCI Machine Learning Repository: A collection of machine learning
datasets.
3. TensorFlow Documentation: Official documentation for the
TensorFlow deep learning framework.
# Articles
1. "A Beginner's Guide to Data Science" by DataCamp (2020)
2. "An Introduction to Deep Learning" by Towards Data Science
(2019)