

Utkarsh

-

PROFESSIONAL SUMMARY
Senior Python Developer with 5+ years of experience developing and deploying cutting-edge software solutions in Python, and capable of creating dynamic, responsive front-end user interfaces using HTML5, CSS3, React, and JavaScript.
Adept in the Django framework for developing feature-rich web applications with optimal performance, and experienced with MySQL and MongoDB databases for efficient data storage and retrieval.
Expertise in developing, deploying, and maintaining scalable, reliable Python applications on Linux servers.
Specialized in Time Series and Market Risk data engineering, with a focus on designing and implementing end-to-end data pipelines; proficient in handling large-scale data sets and ensuring data integrity and accuracy.
Proficient in designing and building software applications using OOP principles to create modular, maintainable, and scalable code.
Experienced with AWS services such as EC2, DynamoDB, Lambda, and S3 for scalable cloud solutions, and skilled in integrating FastAPI and SOAP APIs to enhance application functionality and user experience.
Skilled in using AWS CodePipeline, CloudWatch, and Redshift to ensure secure and optimized cloud architecture.
Proficient in microservices architecture, Kafka, and PySpark for efficient data processing and analytics; expert in Git and Jenkins for version control and continuous integration, deploying with Kubernetes and Docker.
Strong testing skills using Pytest and Selenium, ensuring high-quality, reliable code; proficient in data science libraries such as Pandas, NumPy, PyTorch, Boto3, and PySpark for advanced data analysis and machine learning.
Proficient in popular web frameworks such as Django REST Framework and Flask for developing robust and efficient RESTful APIs.
Experienced in using Matplotlib to create charts, graphs, and other data visualizations, and familiar with the different types of plots it supports.
Expertise in using scikit-learn to build and train machine learning models that are accurate, efficient, and scalable.
Expertise in using multiprocessing to parallelize tasks, and familiar with multithreading to improve the performance of otherwise single-threaded applications.
Skilled in Terraform for managing infrastructure as code with scalability and consistency; expert in Agile methodologies, fostering collaboration and accelerated development cycles.
 Exceptional communication and teamwork skills, promoting seamless cross-functional interactions. Creative problem solver
with strong analytical abilities for tackling complex challenges.
Highly organized with efficient time management, consistently meeting deadlines; enthusiastic about innovation, attention to detail, and delivering exceptional results.
 Well-versed in the intricacies of the Software Development Life Cycle (SDLC), ensuring that projects are meticulously
planned, executed, and delivered.

WORK EXPERIENCE
Wells Fargo | SENIOR PYTHON ENGINEER MARCH 2021 – PRESENT

Worked as a senior Python developer in the product engineering department, developing an innovative and secure Django-based fraud detection application on Amazon Web Services (AWS). The application supports the mission of making financial wellness tools more accessible to US workers by providing a safe and simple option through which employees without a traditional bank account can access their pay, giving them freedom in spending their wages.

Designed and developed the backend infrastructure using Django, creating RESTful APIs to facilitate data retrieval and manipulation, and wrote MySQL queries to extract and aggregate data from multiple sources, providing meaningful insights for visualization.
Leveraged Power BI to design and develop an interactive data analytics dashboard that provided actionable insights and real-time visualizations, connecting to diverse data sources, including databases and APIs, through Power BI's interface.
Designed and developed an interactive data analytics dashboard applying strong Object-Oriented Programming (OOP) principles, utilizing Python's OOP capabilities to architect a modular and extensible codebase.
Leveraged React, HTML5, CSS3, and JavaScript to build a responsive and user-friendly front-end interface for the data analytics dashboard, and collaborated with UX/UI designers to ensure a visually appealing and intuitive user experience while optimizing page load times.
Utilized Python libraries such as Pandas and NumPy to preprocess and clean raw data, ensuring accuracy and reliability for subsequent analysis, and implemented dynamic visualizations using Matplotlib, crafting interactive charts and graphs for data trend analysis.
 Designed and implemented a Linux-based dashboard to monitor and visualize system performance using Prometheus,
Grafana, and Elasticsearch.
 Designed and implemented RESTful APIs as part of a fraud detection system, focusing on modularity and scalability to
accommodate evolving fraud patterns.
Designed and implemented data engineering solutions for the trade floor, focusing on Time Series and Market Risk data and ensuring efficient extraction, transformation, and loading (ETL) of large-scale financial datasets.
Utilized multithreading to improve dashboard responsiveness by running independent components on separate threads, and integrated Matplotlib with other Python libraries, such as Pandas and NumPy, to simplify data manipulation and visualization.
Integrated the backend with AWS services such as EC2, CloudFront, DynamoDB, and Lambda, ensuring seamless data storage, retrieval, and serverless functions, and utilized AWS Glue to automate data extraction, transformation, and loading processes, maintaining data accuracy and freshness.
 Set up and configured Linux environments for Python development, ensuring seamless integration of development tools and
libraries.
Implemented microservices architecture to enhance scalability and modularity, allowing for future feature expansion, and integrated Kafka to handle real-time data streaming, enabling immediate updates to the dashboard's analytics.
Ensured code quality and stability through version control with Git, enabling effective collaboration and change tracking, and automated deployment pipelines using Jenkins to enable continuous integration and rapid iteration of the dashboard.
 Implemented CI/CD practices by setting up automated testing (unit tests and integration tests) using unittest, ensuring
robustness and reliability. Orchestrated deployments using Kubernetes and Docker, enhancing portability and scalability of
the dashboard.
 Integrated Jira with Git branches, allowing for real-time visibility into the status of tasks and user stories, improving
communication across the development team.
Managed infrastructure as code using Terraform, allowing for consistent and repeatable deployments across environments, and incorporated PySpark for advanced data processing and analysis, enhancing the dashboard's capabilities for handling large datasets.
Worked in alignment with Agile methodologies, participating in sprint planning, daily stand-ups, and retrospectives for efficient project management, and aligned development processes with Software Development Life Cycle (SDLC) methodologies, ensuring structured and well-organized project execution.

United Health Group | PYTHON/AWS ENGINEER JUNE 2019 – FEB 2021

Worked as a Python developer for United Health Group, enhancing an existing UnitedHealth application to improve user experience and increase conversion rates while implementing best practices for reliability, security, and performance optimization, which saved developer time, improved reliability, and reduced costs.
Utilized Python to implement dynamic functionality, ensuring seamless interaction and responsiveness across different parts of the platform, and developed responsive front-end features using React, CSS3, HTML5, and JavaScript to enhance user experience and optimize the platform for mobile and desktop users.
Designed and developed robust backend components using the Django web framework to power the platform's core functionalities, creating RESTful APIs that facilitated seamless communication between the frontend and various microservices.
Integrated SOAP APIs to connect the platform with external services, enriching the experience with real-time data updates, and leveraged NoSQL databases such as Amazon DynamoDB to optimize data storage and retrieval, enhancing platform performance.
 Integrated SAML-based Single Sign-On (SSO) authentication into the United Health application, ensuring secure access to
sensitive health-related data.
 Enhanced and expanded existing RESTful APIs to support new features and functionalities within the United Health
application, contributing to an enriched user experience.
 Utilized Git commit hooks to link code changes directly to Jira issues, enhancing traceability and providing a comprehensive
view of progress within the project.
Orchestrated cloud services with AWS, including EC2, Lambda, and S3 buckets, ensuring high availability, scalability, and secure data storage, and configured AWS CodePipeline to automate deployment workflows, allowing for seamless integration of new features and updates.
Implemented logging and monitoring using AWS CloudWatch, enabling real-time tracking of application performance and proactive issue resolution, and conducted thorough testing using Pytest and Selenium, ensuring code quality, functional correctness, and a seamless user experience.
Integrated Kafka to handle real-time data streaming, enabling immediate updates and notifications for users, and managed source code using Git and Bitbucket, ensuring version control and promoting effective collaboration among developers.
Integrated CI/CD practices using Kubernetes and Docker, ensuring consistent and reliable deployments across different environments, and set up continuous integration workflows with Jenkins, automating testing and deployment processes for rapid iteration and efficient development.
 Utilized Linux shell scripting for task automation, enhancing efficiency in routine development tasks, and contributing to a
more streamlined workflow.
Managed infrastructure as code using Terraform, enabling scalable and repeatable deployments, and leveraged Python libraries such as PyTorch, Boto3, and PySpark to enhance data analysis capabilities, enabling personalized user recommendations and advanced insights.

HCL TECHNOLOGIES | PYTHON ENGINEER MARCH 2018 – MAY 2019


 Designed and developed the backend of the application using Python and Flask, creating robust APIs to facilitate seamless
data transfer and transformation.
 Leveraged HTML, CSS, and JavaScript to develop a user-friendly front-end interface, allowing users to initiate and monitor
data migration processes.
 Utilized AWS services such as S3, Lambda, and EC2 to design a scalable and secure infrastructure for storing, processing, and
managing data during migration.
 Integrated SQL databases to handle data storage and retrieval, ensuring data integrity and consistency throughout the
migration process.
 Implemented SOAP APIs to establish communication with external systems, enabling data extraction and integration from
diverse sources.
 Configured and maintained Git repositories, establishing a robust version control system for tracking code changes and
enhancements in alignment with Jira tasks.
 Set up a robust CI/CD pipeline using Kubernetes and Docker, automating testing and deployment processes for consistent
and reliable application updates.
 Developed comprehensive unit tests using PyUnit to ensure the correctness of the application's functionalities and minimize
potential issues.
 Utilized microservices architecture to modularize components, promoting scalability and maintainability as the application
evolves.
 Integrated Kafka for real-time data streaming, enabling efficient handling of large datasets during migration. Managed
infrastructure as code using Terraform, allowing for consistent and repeatable deployment and scaling of resources.
 Utilized Git for version control, maintaining a well-documented codebase and facilitating collaborative development with
other team members.
 Implemented containerization using Docker on Linux for packaging and distributing Python applications, ensuring consistency
across various environments.

EDUCATION DETAILS
Master of Science (M.S.) in Computer Science, University of Houston-Clear Lake, Texas, USA, 2023.
 Bachelor of Science (B.Sc.) in Electrical Engineering, Institute of Technology, Nirma University, Gujarat, India, 2019.
