
Automation of Content Management Dashboard

AN INTERNSHIP REPORT

Submitted by

Syed Abdul Khader


RRN:200071601139

in partial fulfillment for the award of the degree of


BACHELOR OF TECHNOLOGY
in
COMPUTER SCIENCE AND ENGINEERING

NOVEMBER 2023
BONAFIDE CERTIFICATE

Certified that this Internship report “AUTOMATION OF CONTENT MANAGEMENT


DASHBOARD” is the bonafide work of “SYED ABDUL KHADER (200071601139)”
who carried out the internship under my supervision. Certified further, that to the best
of our knowledge the work reported herein does not form part of any other internship
report or dissertation on the basis of which a degree or award was conferred on an
earlier occasion on this or any other candidate.

Dr. B. DHANALAKSHMI
Assistant Professor
Department of CSE
B.S. Abdur Rahman Crescent
Institute of Science and Technology
Vandalur, Chennai – 600 048.
INTERNSHIP OFFER LETTER
INTERNSHIP EXPERIENCE CERTIFICATE
INTERNSHIP COMPLETION CERTIFICATE
VIVA VOCE EXAMINATION

The viva voce examination of the Internship titled “AUTOMATION OF CONTENT

MANAGEMENT DASHBOARD”, submitted by SYED ABDUL KHADER

(200071601139) is held on ______________.

INTERNAL EXAMINER
ACKNOWLEDGEMENT

I sincerely express my heartfelt gratitude to Prof. Dr. T. MURUGESAN, Vice Chancellor, and Dr. N. THAJUDDIN, Pro-Vice Chancellor, B.S. Abdur Rahman Crescent Institute of Science and Technology, for providing me an environment to carry out my internship successfully.

I sincerely thank Dr. N. RAJA HUSSAIN, Registrar, for furnishing every essential
facility for doing my internship.
I thank Dr. SHARMILA SANKAR, Dean, School of Computer Information and
Mathematical Sciences for her motivation and support.

I thank Dr. W. AISHA BANU, Professor and Head, Department of Computer Science
and Engineering, for providing strong oversight of vision, strategic direction and
valuable suggestions.

I am obliged to my class advisor Dr. B. DHANALAKSHMI, Assistant Professor, Department of Computer Science and Engineering, for her professional guidance and continued assistance throughout the internship period.

I thank Mr. Hari Narayana Vijay Balasubramaniam (Manager) and Mr. Mahesh Beeram Reddy (Mentor), Intellect Design Arena Ltd, for their guidance and encouragement throughout the internship period; without their support this project would not have been successful.

I thank all the faculty members and the system staff of the Department of Computer
Science and Engineering for their valuable support and assistance at various stages
of internship completion.

(SYED ABDUL KHADER)


TABLE OF CONTENTS

CHAPTER NO. TITLE

LIST OF FIGURES
1 COMPANY OVERVIEW
  1.1 About the Company
  1.2 Company Website
2 ROLE AND RESPONSIBILITY
  2.1 ROLE
  2.2 RESPONSIBILITY
3 PROJECT OVERVIEW
4 THEORETICAL ANALYSIS
  4.1 Uploading files in Google Drive
  4.2 Uploading Excel file in Google Drive
  4.3 Uploading files from Google Sites to Google Drive
  4.4 Uploading Google Sheets files in Google Drive
  4.5 Uploading all the files in Google Drive
5 RESULTS AND CONCLUSION
6 REFERENCES
7 APPENDIX
  A1 - Source Code
  A2 - Technical Bibliography
LIST OF FIGURES

FIGURE NO. TITLE

1 Uploading files in Google Drive
2 Uploading Excel file in Google Drive
3 Uploading files from Google Sites to Google Drive
4 Uploading Google Sheets files in Google Drive
5 Uploading all the files in Google Drive
CHAPTER 1
COMPANY OVERVIEW

1.1 About The Company


Intellect Design Arena, a Polaris Group company, is a global leader in Financial Technology for Banking, Insurance and other Financial Services. A uniquely focused products business, Intellect addresses the needs of financial institutions in varying stages of technology adoption.

Intellect embodies rich intellectual property and robust platforms & products across Global Consumer Banking (iGCB), Central Banking, Risk & Treasury Management (iRTM), Global Transaction Banking (iGTB) and Insurance (Intellect SEEC). With over a decade of continuous and significant research and development investment, the Intellect suite is the largest in the industry.

iGTB – the world’s first complete Global Transaction Banking Platform. It enables the customer to seize the tremendous global transaction banking opportunity, conservatively estimated at $509bn by 2021. The formidable third-generation iGTB, with its built-in omnichannel Corporate Banking Exchange, powers the customer’s way to a Principal Banker position, delivering the financial technology customers have always needed to leverage expertise and on-field innovation without constraints.

iGCB – the most advanced Consumer Banking Platform built on current technologies. Seamless omnichannel banking with lifecycle assurance optimizes first-time cost of ownership and technology running costs, with peerless productivity at 70% lower post-implementation costs. The crown jewel is Intellect Quantum CBS, the specialist Core Banking Solution for the unique requirements of central banks. Trusted by the central banks of India, Seychelles, Ethiopia and now in Europe, the solution deploys a formidable array of advanced technology frameworks. Running active balance sheets for nations on real-time enterprise GLs, the solution enables a single source of truth and has a proven track record for the fastest and most cost-efficient implementation.

1.2 Website

https://www.intellectdesign.com/

Address

Intellect Design Arena Ltd


CIN No: L72900TN2011PLC080183
Plot No. 3/G3, SIPCOT IT Park, Siruseri,
Chennai – 600130, India.

CHAPTER 2
ROLE AND RESPONSIBILITY

2.1 ROLE

I applied for the role of DevSecOps Engineer. A DevSecOps Engineer (Development, Security, and Operations) integrates security practices into the DevOps pipeline to ensure that security is not a separate entity but an integral part of the entire software development and delivery process.

Automation:

Identify repetitive tasks and automate them to improve efficiency.


Automate configuration management, deployment, and monitoring processes.

Collaboration and Communication:

Facilitate collaboration between development, operations, and other teams.


Promote a culture of shared responsibility and effective communication.

Cloud Management:

Manage and optimize cloud infrastructure for scalability, reliability, and cost-efficiency.
Leverage cloud services to enhance development and deployment processes.

User Access and Permissions:

Define and enforce role-based access controls for DevOps tools and environments.
Ensure proper permissions for team members based on their responsibilities.

2.2 RESPONSIBILITY

• Integrate security practices into the DevOps pipeline.


• Collaborate with development and operations teams to embed security
seamlessly.
• Implement and maintain automated security tests in the CI/CD pipeline.

CHAPTER 3
PROJECT OVERVIEW

The project involves creating a Python script to automate interactions with the Google
Drive API. This automation will allow users to perform various tasks such as uploading,
downloading, and managing files and folders on Google Drive programmatically.

Key Features:

Authentication:
Implement OAuth 2.0 authentication to authorize the application to access Google
Drive on behalf of the user.
Use the Google Drive API credentials to authenticate and obtain access tokens.

File Upload and Download:


Enable users to upload files from the local machine to Google Drive.
Allow users to download files from Google Drive to their local machine.

File Listing and Management:


Retrieve a list of files and folders from Google Drive.
Implement functionality to create, delete, and move files and folders.

Folder Operations:
Create a folder on Google Drive.
Move files into and out of folders.

Search and Filtering:


Implement search functionality to locate specific files based on criteria such as name,
type, or modification date.

Metadata Retrieval:
Retrieve metadata information for files and folders, such as file size, creation date,
and last modified date.

Error Handling:
Implement robust error handling to manage potential issues, such as authentication
errors or API quota limits.

Logging:
Implement logging to capture important events and errors during the execution of the
script.

Technologies Used:
Programming Language: Python
Google APIs Client Library: google-api-python-client
Authentication Library: google-auth-oauthlib
Logging Library: logging
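The search-and-filtering feature listed above boils down to assembling the `q` filter string that the Drive API's file-listing endpoint accepts. A minimal sketch follows; the helper name and its criteria are illustrative assumptions, not part of the project's actual code:

```python
# Hypothetical helper: build the `q` filter string for service.files().list().
# Drive combines clauses with "and", e.g.
# "name contains 'report' and mimeType = 'text/plain'".
def drive_query(name=None, mime_type=None, modified_after=None):
    clauses = []
    if name:
        clauses.append(f"name contains '{name}'")
    if mime_type:
        clauses.append(f"mimeType = '{mime_type}'")
    if modified_after:
        # Drive expects an RFC 3339 timestamp, e.g. '2023-11-01T00:00:00'
        clauses.append(f"modifiedTime > '{modified_after}'")
    return " and ".join(clauses)
```

The resulting string would be passed as the `q` argument of `service.files().list(q=..., spaces="drive")`.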

CHAPTER 4
THEORETICAL ANALYSIS

4.1 Uploading files in Google Drive

Uploading files to Google Drive using Python involves leveraging the Google Drive
API to interact with the user's Google Drive account programmatically. The process
typically begins with setting up a Google Cloud Console project, enabling the Google
Drive API, and obtaining OAuth 2.0 credentials. The Python script utilizes the google-api-python-client library for API interactions and google-auth-oauthlib for
authentication. Once authenticated, the script constructs an HTTP request to upload
the desired file, specifying the file's metadata and content. The script then sends this
request to the Google Drive API endpoint. Successful execution results in the file being
uploaded to the user's Google Drive. Proper error handling, logging, and user-friendly
documentation are crucial components of the script to ensure robustness and ease of
use. This automation provides a convenient and efficient way to manage file uploads
to Google Drive, particularly when integrated into larger workflows or applications.
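The metadata-and-content step described above can be sketched as a small helper that prepares what a `files().create()` call needs. The function name and defaults are illustrative assumptions; the report's actual script builds these inline:

```python
import mimetypes
from pathlib import Path

# Hypothetical helper: prepare the metadata dict and MIME type for a
# Drive files().create() upload of a local file.
def build_upload_request(file_path, folder_id=None):
    path = Path(file_path)
    mime_type, _ = mimetypes.guess_type(path.name)
    metadata = {"name": path.name}
    if folder_id:
        metadata["parents"] = [folder_id]
    return metadata, mime_type or "application/octet-stream"

# With an authenticated Drive `service`, the upload itself would then be:
#   media = MediaFileUpload(file_path, mimetype=mime_type)
#   service.files().create(body=metadata, media_body=media, fields="id").execute()
```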

4.2 Uploading Excel file in Google Drive

Uploading an Excel file to Google Drive using Python involves utilizing the Google Drive
API to interact with the cloud storage service and the Google Sheets API to handle
spreadsheet-related operations. The process begins with setting up a Google Cloud
Console project, enabling both the Google Drive API and Google Sheets API, and
obtaining OAuth 2.0 credentials. In the Python script, the google-api-python-client
library facilitates communication with the APIs, and google-auth-oauthlib manages
authentication. After authentication, the script creates or accesses the desired Google
Drive folder and uses the Google Sheets API to export the spreadsheet data to a
temporary file. Subsequently, it utilizes the Google Drive API to upload this file to the
specified folder on Google Drive. Error handling, logging, and clear documentation are
crucial components to ensure the script's reliability and user-friendliness. This
automation proves beneficial for systematically organizing and managing Google
Sheets within Google Drive through a streamlined and programmatic approach.
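The export step above hinges on choosing the right MIME type for the spreadsheet format. A minimal sketch follows; the mapping and helper are illustrative assumptions rather than the project's actual code:

```python
import os

# Hypothetical mapping from file extensions to the MIME types used when
# exporting spreadsheet data through the Drive API.
EXPORT_MIME = {
    ".xlsx": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    ".csv": "text/csv",
    ".pdf": "application/pdf",
}

def export_mime_for(filename):
    """Return the export MIME type matching the file's extension."""
    ext = os.path.splitext(filename)[1].lower()
    return EXPORT_MIME.get(ext, "application/octet-stream")

# With an authenticated `service`, exporting a Sheet to a local file would
# then use something like:
#   service.files().export(fileId=sheet_id, mimeType=export_mime_for("out.xlsx"))
```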

save credentials.py code output screenshot:

4.3 Uploading files from Google Sites to Google Drive

To upload files from Google Sites to Google Drive using Python, you need to utilize
the Google Drive API and, if necessary, the Google Sites API. The process involves
setting up a Google Cloud Console project, enabling the required APIs, obtaining
OAuth 2.0 credentials, and using Python to authenticate your application. Once
authenticated, you can retrieve the files from Google Sites using the Google Sites API
or another method.

To perform the actual upload, you'll use the Google Drive API. Specify the file's
metadata, such as its name and the parent folder's ID, and provide the file's content.
This typically involves creating a MediaFileUpload object with the file's path and MIME
type. The Python script sends this information to the Google Drive API endpoint,
resulting in the file being uploaded to the specified folder in Google Drive.

It's crucial to implement error handling to manage potential issues during the upload
process, such as network problems or insufficient permissions. Additionally, logging
can be implemented to capture important events and errors for later review.

This automation process simplifies the manual effort required to move files from
Google Sites to Google Drive, making it more efficient and programmatically
accessible. The script provides a structured and repeatable way to handle this file
transfer, enhancing the overall management of digital assets within a Google
Workspace environment.
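The error handling called for above, covering transient issues like network problems, can be sketched as a small retry wrapper around the upload call. The name and parameters are illustrative assumptions, not from the project's actual code:

```python
import time

# Hypothetical retry wrapper: call `fn`, retrying on failure up to
# `attempts` times, re-raising the last error if every attempt fails.
def with_retries(fn, attempts=3, delay=0.0):
    last_error = None
    for _ in range(attempts):
        try:
            return fn()
        except Exception as exc:  # in practice, catch googleapiclient's HttpError
            last_error = exc
            time.sleep(delay)
    raise last_error
```

The upload would then be invoked as, e.g., `with_retries(lambda: service.files().create(...).execute())`.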

Output:

4.4 Uploading Google Sheets to Google Drive

Uploading Google Sheets files to Google Drive follows the same workflow described in Section 4.2 for Excel files: set up a Google Cloud Console project, enable both the Google Drive API and the Google Sheets API, obtain OAuth 2.0 credentials, authenticate with google-auth-oauthlib, export the spreadsheet data to a temporary file through the Google Sheets API, and upload that file to the target folder with the Google Drive API. The same error handling, logging, and documentation considerations apply, and the automation likewise helps systematically organize and manage Google Sheets within Google Drive.

Code output Screenshot:

4.5 Uploading all the files in Google Drive


Uploading all files to Google Drive using Python involves leveraging the Google Drive API to interact with the cloud storage service programmatically. Initially, you need to set up a Google Cloud Console project, enable the Google Drive API, and obtain OAuth 2.0 credentials for authentication. With these credentials, your Python script can authenticate and obtain the necessary access tokens. Subsequently, the script traverses through the local file system, identifies each file, and utilizes the Google Drive API to upload them one by one. This process includes specifying the metadata for each file, such as its name and parent folder ID, and providing the file's content for upload. The script should also handle potential errors during the upload process, ensuring robustness and reliability. Incorporating logging mechanisms helps capture relevant events and errors for later analysis. Comprehensive documentation should accompany the script, explaining the setup process, authentication, and usage instructions. Automating the upload of all files to Google Drive streamlines the management of digital assets, making it particularly useful for backup or organizational purposes.
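The traversal step described above can be sketched as a helper that collects the regular files in a directory, optionally filtered by extension. The function name is an illustrative assumption; the Drive upload call itself is omitted:

```python
import os

# Hypothetical helper: list the regular files in `directory`, skipping
# subdirectories, optionally keeping only the given extensions. Each
# returned name would then be uploaded to Drive in turn.
def files_to_upload(directory, allowed_extensions=None):
    names = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue  # skip subdirectories and other non-file entries
        if allowed_extensions and os.path.splitext(name)[1].lower() not in allowed_extensions:
            continue
        names.append(name)
    return names
```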
Output Screenshot:

CHAPTER 5
RESULTS AND CONCLUSION
RESULTS
The scripts use the Google Drive API to upload various file types, including a text file, an image file, an Excel file, and a Google Sheets file. Each snippet begins with authentication using OAuth 2.0 credentials, followed by the creation of file metadata specifying the name and parent folder in Google Drive. The MediaFileUpload class is then employed to handle the actual file upload, with the script printing the unique identifier (ID) assigned to the uploaded file. The text file and image file examples use their respective MIME types, while the Excel file and Google Sheets file examples use the MIME types for Excel and CSV formats, respectively. These code snippets provide a foundation for automating the upload of diverse file types to Google Drive using Python. Remember to replace placeholders such as the credentials file path and folder IDs with your specific values for successful execution.
CONCLUSION
In conclusion, the Python scripts presented showcase the versatility of the Google Drive
API for uploading different file types to Google Drive. By leveraging OAuth 2.0
credentials and the appropriate MIME types, the scripts authenticate with Google Drive
and efficiently handle the upload process for text files, image files, Excel files, and
Google Sheets files. These examples serve as practical starting points for developers
looking to integrate file uploading functionality into their applications or automation
workflows. The modularity of the provided code allows for easy adaptation and
extension to accommodate various file formats and use cases. When implementing
these scripts, it's crucial to ensure accurate MIME types, replace placeholders with
actual values, and consider error handling for a robust and reliable solution. Overall,
these Python snippets empower users to automate the uploading of diverse file types
to Google Drive, contributing to streamlined file management within the Google
Workspace ecosystem.

REFERENCES
https://youtu.be/3wC-SCdJK2c?si=taJB5lBn23b4AdI3

https://developers.google.com/sheets/api/quickstart/python

https://blog.finxter.com/automate-backup-to-google-drive-with-python/

https://www.projectpro.io/recipes/upload-files-to-google-drive-using-python#mcetoc_1g02b3q8jbc

https://www.geeksforgeeks.org/how-to-automate-google-sheets-with-python/

https://thepythoncode.com/article/using-google-drive--api-in-python

APPENDIX
Source Code:
Python code:
import os
import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError
from googleapiclient.http import MediaFileUpload

SCOPES = ["https://www.googleapis.com/auth/drive"]

creds = None

# Reuse a previously saved token if one exists.
if os.path.exists("token.json"):
    creds = Credentials.from_authorized_user_file("token.json", SCOPES)

if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
        creds = flow.run_local_server(port=0)

    # Save the token for later runs. (The original wrote to credentials.json,
    # which would overwrite the client secrets; token.json is the correct target.)
    with open("token.json", "w") as token:
        token.write(creds.to_json())

try:
    service = build("drive", "v3", credentials=creds)

    # Look for an existing backup folder on Drive.
    response = service.files().list(
        q="name='BackupFolder2022' and mimeType='application/vnd.google-apps.folder'",
        spaces="drive"
    ).execute()

    if not response["files"]:
        file_metadata = {
            "name": "BackupFolder2022",
            "mimeType": "application/vnd.google-apps.folder"
        }
        file = service.files().create(body=file_metadata, fields="id").execute()
        folder_id = file.get("id")
    else:
        folder_id = response["files"][0]["id"]

    # Upload every regular file in the local backup directory.
    backup_files_dir = "backupfiles"
    if os.path.isdir(backup_files_dir):
        for file in os.listdir(backup_files_dir):
            file_path = os.path.join(backup_files_dir, file)
            if os.path.isfile(file_path):
                file_metadata = {
                    "name": file,
                    "parents": [folder_id]
                }
                media = MediaFileUpload(file_path)
                upload_file = service.files().create(
                    body=file_metadata, media_body=media, fields="id"
                ).execute()
                print("Backed up file:", file)
            else:
                print("Skipped non-file item in backupfiles directory:", file)
    else:
        print("Backup files directory not found:", backup_files_dir)

except HttpError as e:
    print("Error:", str(e))

save_credentials.py code:
import json

# Client configuration for an installed application. Note that
# InstalledAppFlow.from_client_secrets_file expects the client details
# nested under an "installed" key in credentials.json.
credentials = {
    "installed": {
        "client_id": "863126694652-90gtgjf2pe3apcll2g838n9dckoqfv5l.apps.googleusercontent.com",
        "project_id": "mystical-binder-390009",
        "auth_uri": "https://accounts.google.com/o/oauth2/auth",
        "token_uri": "https://oauth2.googleapis.com/token",
        "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        "client_secret": "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",
        "redirect_uris": ["http://localhost"]
    }
}

try:
    with open("credentials.json", "w") as file:
        json.dump(credentials, file, indent=4)
    print("Credentials saved successfully.")
except IOError as e:
    print(f"An error occurred while saving credentials: {e}")

Token.py code:
import requests

# Define the required parameters
client_id = "863126694652-90gtgjf2pe3apcll2g838n9dckoqfv5l.apps.googleusercontent.com"
client_secret = "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"
redirect_uri = "http://localhost"  # must be a single string, not a list
authorization_code = "4/0AbUR2VNyjy656-7n5cywOqQIAGWOs-oHKglvlloWvWbCjl95G6JM3Tj5zTaXWHEf1163AA"

# Prepare the request payload
payload = {
    "code": authorization_code,
    "client_id": client_id,
    "client_secret": client_secret,
    "redirect_uri": redirect_uri,
    "grant_type": "authorization_code"
}

# Make the POST request to the token endpoint
token_url = "https://oauth2.googleapis.com/token"
response = requests.post(token_url, data=payload)

# Check the response status
if response.status_code == 200:
    # Request was successful
    token_data = response.json()
    # Extract the access token and refresh token from the response
    access_token = token_data["access_token"]
    refresh_token = token_data["refresh_token"]
    # Save the tokens to your token.json file or use them as needed
else:
    # Request failed
    print("Token request failed:", response.status_code, response.text)

Google Drive to Google Sites API code:
import os
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload

# Set up service-account credentials
credentials_path = 'credentials service account.json'
SCOPES = ['https://www.googleapis.com/auth/drive',
          'https://www.googleapis.com/auth/sites']

credentials = service_account.Credentials.from_service_account_file(
    credentials_path, scopes=SCOPES)

# Create Drive and Sites service instances
drive_service = build('drive', 'v3', credentials=credentials)
sites_service = build('sites', 'v1', credentials=credentials)

# Define the file and site information
file_id = '1FUGfQEXHKPvBmIxSpQ81wQG9FQVf2HGwF-pbz-zLz2g'  # Replace with the ID of the file in Google Drive
site_name = 'google sheets API'  # Replace with the name of the target Google Site
attachment_name = 'Google site'  # Name for the attachment in Google Sites

# Get the file metadata from Google Drive
file = drive_service.files().get(fileId=file_id).execute()

# Download the file from Google Drive
os.makedirs('temp', exist_ok=True)
file_path = os.path.join('temp', file['name'])  # Local path for the downloaded file
request = drive_service.files().get_media(fileId=file_id)
with open(file_path, 'wb') as fh:
    downloader = MediaIoBaseDownload(fh, request)
    done = False
    while not done:
        status, done = downloader.next_chunk()

# Upload the file to Google Sites
site = sites_service.sites().get(siteName=site_name).execute()
site_content_path = site['site']['siteContent']['content']['name']
attachment_url = f"sites/{site_content_path}/attachments/{attachment_name}"
attachment_file_path = os.path.abspath(file_path)
media = MediaFileUpload(attachment_file_path)
attachment = sites_service.sites().attachments().create(
    parent=attachment_url, media_body=media).execute()

# Delete the temporary file
os.remove(file_path)

print('File uploaded successfully to Google Sites.')

Code for Google Sheets API using Python:

import os
import time
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]
SPREADSHEET_ID = "1FUGfQEXHKPvBmIxSpQ81wQG9FQVf2HGwF-pbz-zLz2g"

def main():
    credentials = None

    if os.path.exists("token.json"):
        credentials = Credentials.from_authorized_user_file("token.json", SCOPES)

    if not credentials or not credentials.valid:
        if credentials and credentials.expired and credentials.refresh_token:
            credentials.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
            credentials = flow.run_local_server(port=0)

        with open("token.json", "w") as token:
            token.write(credentials.to_json())

    while True:
        try:
            service = build("sheets", "v4", credentials=credentials)
            sheets = service.spreadsheets()

            # Read two operands per row, write the sum and a status flag.
            for row in range(2, 8):
                num1 = int(sheets.values().get(
                    spreadsheetId=SPREADSHEET_ID,
                    range=f"Sheet1!A{row}").execute().get("values")[0][0])
                num2 = int(sheets.values().get(
                    spreadsheetId=SPREADSHEET_ID,
                    range=f"Sheet1!B{row}").execute().get("values")[0][0])
                calculation_result = num1 + num2
                print(f"Processing {num1} + {num2}")

                sheets.values().update(
                    spreadsheetId=SPREADSHEET_ID,
                    range=f"Sheet1!C{row}",
                    valueInputOption="USER_ENTERED",
                    body={"values": [[f"{calculation_result}"]]}
                ).execute()

                sheets.values().update(
                    spreadsheetId=SPREADSHEET_ID,
                    range=f"Sheet1!D{row}",
                    valueInputOption="USER_ENTERED",
                    body={"values": [["Done"]]}
                ).execute()

            # Delay between each iteration (e.g., 1 minute)
            time.sleep(60)

        except HttpError as error:
            print(f"An error occurred: {error}")

if __name__ == "__main__":
    main()

Python code for uploading all the files in a single folder:

import os
from googleapiclient.discovery import build
from google.oauth2.credentials import Credentials
from googleapiclient.http import MediaFileUpload

# Set up Google Drive API credentials
credentials = Credentials.from_authorized_user_file('token.json')
drive_service = build('drive', 'v3', credentials=credentials)

# Define the folder ID where you want to upload the files
folder_id = '1PKbt-0rKzZcpRglRX-NHh-sFF6xsIvMy'

# Retrieve a list of files to upload (change the directory path as per your requirement)
directory = 'C:/Users/syed'

# Filter files based on extensions (modify this list as needed)
allowed_extensions = ['.jpg', '.png', '.txt']

# Get all files in the directory with allowed extensions
files = [f for f in os.listdir(directory)
         if os.path.isfile(os.path.join(directory, f))
         and os.path.splitext(f)[1].lower() in allowed_extensions]

# Upload each file
for file_name in files:
    file_path = os.path.join(directory, file_name)

    # Create file metadata
    file_metadata = {
        'name': file_name,
        'parents': [folder_id]
    }

    # Upload the file
    media = MediaFileUpload(file_path)
    file = drive_service.files().create(body=file_metadata, media_body=media,
                                        fields='id').execute()
    print(f'Uploaded file: {file_name} (ID: {file["id"]})')

Technical Bibliography:
Hello! I'm Syed Abdul Khader, pursuing a B.Tech in Computer Science and Engineering at B.S. Abdur Rahman Crescent Institute of Science and Technology, Chennai. With a passion for DevSecOps, I have dedicated my time to the automation of a CM dashboard using the Google Drive API. My professional journey includes an internship at Intellect Design Arena Ltd, where I demonstrated my commitment by automating the team's workflow. Outside of work and academics, you might find me playing badminton or listening to music, as I believe in maintaining a balance between professional growth and personal well-being. I am excited about placement or postgraduate study, and I look forward to exploring opportunities that allow me to contribute to the IT world.
