
Dakxesh P

Sr. Python Expert


USC/NJ
862-803-0696
dakxishpython@gmail.com
PROFESSIONAL SUMMARY:
 10+ years of professional experience as a Sr. Python Developer specializing in the design, development, and implementation of Python-based applications using frameworks such as Django and Flask, along with client-server technologies.
 Professional experience in the development, implementation, and optimization of data pipeline systems and ETL processes.
 Involved in the entire project lifecycle, including design, development, testing, deployment, implementation, and support.
 Experience in Agile/Scrum methodology, delivering high-quality deliverables on time.
 Experienced in IaaS, PaaS, and SaaS platforms such as AWS, Azure, and Google Cloud.
 Experience building machine learning models using Nvidia GPUs for accelerated computing.
 Expertise in object-oriented design and coding, with good knowledge of design patterns and UML.
 Experience with web development, web services, Python, and the Django/Flask frameworks.
 Experienced in developing web applications with the Model-View-Template (MVT) architecture of Python's Django web framework.
 Well versed in writing SOAP and RESTful web services for claims web applications.
 Participated in continuous improvement initiatives, staying updated with the latest developments in both Golang and
Python and sharing knowledge with team members.
 Monitored and reported on Chatbot performance metrics, leading to continuous improvements and updates.
 Experience developing highly interactive web applications utilizing HTML5, CSS, JSON, AngularJS, and Bootstrap, and integrating RESTful APIs.
 Proficient in SQL databases (MS SQL, MySQL, Oracle) and NoSQL databases (MongoDB, Cassandra).
 Experienced with writing custom queries through database connectors.
 Planned and managed the development of a React component library, defining project timelines, milestones, and
deliverables.
 Experience working with Python ORM libraries, including Django ORM and SQLAlchemy (see the sketch after this summary).
 Experience in strategy and practical implementation of AWS Cloud technologies, including EC2, EBS, S3, ELB, VPC, AMI, IAM, ECS, Route 53, RDS, Lambda, DynamoDB, CloudWatch, SNS, Config, CloudTrail, Elastic Beanstalk, and CloudFormation templates.
 Implemented business logic, data exchange, XML processing, and graphics creation using Python and Django.
 Experience building database models, APIs, and views in Python to create interactive web-based solutions.
 Developed a fully automated continuous integration system using Git, Jenkins, MySQL, and custom tools developed in
Python and Bash.
 Experienced in integrating various relational and non-relational sources such as DB2, Oracle, SQL Server, MongoDB (NoSQL), XML, and flat files into a Netezza database.
 Hands-on Experience in Data Management, Data Security, Data Modeling, Workflow Automation, Formulas &
Validations, Chatter.
 Experience in building frameworks and automating complex workflows using Python for Test Automation.
 Experienced in building/maintaining Docker container clusters managed by Kubernetes, Linux, Bash, Git, and Docker on Google Cloud Platform (GCP).
 Worked with backend Python automation, Docker, and cloud provisioning/automation.
 Automated workflows that were previously initiated manually, using Python scripts and Unix shell scripting.
 Experienced in various types of testing such as Unit testing, Integration testing, User acceptance testing, Functional
testing.
 Experienced in developing applications using Amazon Web Services such as EC2, CloudSearch, Elastic Load Balancer (ELB), S3, CloudFront, and Route 53.
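
The following is a minimal SQLAlchemy sketch of the kind of ORM work described above; the Claim model and the SQLite connection string are illustrative placeholders, not artifacts of any project below:

    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.orm import declarative_base, Session

    Base = declarative_base()

    class Claim(Base):
        # Hypothetical table used only for illustration.
        __tablename__ = "claims"
        id = Column(Integer, primary_key=True)
        status = Column(String(32))

    # SQLite keeps the sketch self-contained; any supported backend works.
    engine = create_engine("sqlite:///claims.db")
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        session.add(Claim(status="open"))
        session.commit()
        open_claims = session.query(Claim).filter_by(status="open").all()
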
TECHNICAL SKILLS

Programming Languages: Python 2.X, Python 3.X, Java, Golang


Python Web Frameworks: Django, Flask.
Python Packages: Scikit-learn, NumPy, Pandas, Django, Flask, Matplotlib, PySpark, NLTK,
TensorFlow, PyTorch, MATLAB.
Machine learning: Classification, Regression, Clustering, NLP, Deep Learning.
Big Data Technologies: Spark, Scala, Hadoop, Hive, Cassandra, PySpark, MapReduce, Apache Kafka, Pig, YARN, Sqoop.
Cloud: AWS, GCP, Azure.
Web Technologies: HTML, DHTML, React JS, jQuery, XML, CSS, JSON, Bootstrap, NodeJS.
Databases: MySQL, PLSQL, MSSQL, PostgreSQL.
Web Services: SOAP, REST.
BI tools: Tableau, Data Cleaning, Data Blending, ETL, Data Wrangling, Data Mining.
Editors/IDEs: PyCharm, PyDev (Eclipse), Jupyter, Visual Studio.
Testing: JUnit, pytest, TDD, BDD, Postman, manual testing.
Methodologies: Agile, Waterfall, SCRUM.
Version Control: SVN, Git.
Build/CI Tools: Maven, Log4j, Jenkins, Kubernetes.
Operating Systems: Windows, Linux, UNIX, MacOS.

RELEVANT EXPERIENCE:
Client Name: Marsh Financial Services Nebraska (Remote) Dec 2023 – Present
Role: Senior Python Developer (70% backend, 30% frontend)
Responsibilities:
 Participated in the analysis, design, and development phase of the Software Development Lifecycle (SDLC) in an agile
environment. Conducted sprint planning, scrum calls, and retrospective meetings for each sprint. Utilized Git for version
control and managed project activities using JIRA and Confluence.
 Implement Infrastructure as Code (IaC) using tools like Terraform or AWS CloudFormation to automate infrastructure
provisioning and management.
 Developed dynamic, responsive user interfaces using Vue.js for various web applications.
 Developed software applications that leverage Nvidia GPU capabilities for enhanced performance.
 Collaborated with cross-functional teams, including product managers and data scientists, to integrate LLM capabilities into existing systems.
 Developed message-driven beans (MDBs) to receive and process data from Apache Kafka message queues.
 Used Apache Kafka as a distributed publish-subscribe messaging system designed to be fast, scalable, and durable.
 Implemented Storm topologies to process high volumes of data from various Kafka topics and write results back to other Kafka topics.
 Worked on Solr, Elastic Search, Kafka, Flume, MongoDB, Cassandra and RabbitMQ.
 Experienced in developing with Apache Kafka as a distributed publish-subscribe messaging system and a robust queue that can handle high volumes of data (see the Kafka sketch after this list).
 Implemented Vue.js components, Vuex for state management, and Vue Router for single-page applications (SPA).
 Developed on Amazon Web Services (AWS) and Microsoft Azure, using services such as AWS EC2, S3, RDS, Azure HDInsight, Azure Storage, Azure Data Lake, and Azure Data Factory (ADF).
 Integrated various data sources like MySQL with Grafana to provide comprehensive insights.
 Implemented and managed data collection processes, integrating various metrics and logs into Grafana for unified
visualization.
 Developed and maintained Grafana dashboards for real-time monitoring and visualization of key performance metrics.
 Involved in the development of real-time streaming applications using PySpark, Apache Spark, Kafka, and Hive on a distributed Hadoop cluster.
 Worked closely with the Kafka admin team to set up Kafka clusters in the QA and production environments.
 Used Kibana and Elasticsearch to identify Kafka message failure scenarios.
 Conducted research on advanced AI models using Nvidia GPUs for experimentation.
 Integrated Hyperflex APIs with existing infrastructure management tools to streamline operations.
 Designed and developed Python-based RESTful Web Services for seamless data exchange with external interfaces using
JSON format. Leveraged Django and PostgreSQL for data analysis and management.
 Created web-based applications using a diverse tech stack including Python, Django, HTML5, XML, JSON, CSS3 and
JavaScript.
 Engineered applications by integrating multiple technologies, such as Python, Django, Pandas, REST, and SQL.
 Developed RESTful web services and APIs, following best practices, and ensuring high-quality code.
 Design and implement scalable backend services using Golang, focusing on high performance and low latency.
 Configured SaaS applications to meet the specific needs of the organization, including setting up workflows, integrations,
and custom fields.
 Developed Spark jobs in PySpark to perform ETL from SQL Server to Hadoop.
 Used Python scripts to monitor AI model performance and implement optimizations, ensuring high accuracy and relevance
in responses.
 Developed comprehensive test plans and conducted integration, system, and regression testing for the React library.
 Used FastAPI, which combines the simplicity and ease of use of Python with the performance and validation capabilities required for building modern web APIs.
 Developed Python scripts and applications to interact with Cassandra databases, ensuring efficient data retrieval and manipulation (see the Cassandra sketch after this list).
 Developed interactive and feature-rich dashboards using technologies like Python, Java, Bootstrap, CSS, JavaScript,
and jQuery. Employed Machine Learning algorithms in Python (Scikit-Learn) to analyze and format data.
 Designed and implemented Python Django forms to capture user data, employing PyTest for comprehensive test case
coverage. Customized SQL queries, Functions, Cursors, and Triggers to fulfill specific client requirements.
 Collaborated with cross-functional teams to integrate Cassandra into Python-based microservices architecture, ensuring
seamless data operations.
 Proficiently utilized Python, Jupyter, and the Scientific computing stack (NumPy, SciPy, pandas, matplotlib) to design
and develop Python/Django frameworks for REST services.
 Utilized SQL queries and ORM frameworks like SQLAlchemy to efficiently retrieve and manipulate data within Python applications.
 Used FastAPI's built-in support for common authentication mechanisms such as OAuth2 and JWT (see the FastAPI sketch after this list).
 Developed and deployed diverse web applications using Python and Django. Established RESTful APIs to enable smooth
frontend-backend communication, ensuring seamless data flow.
 Collaborated closely with UI/UX designers and backend teams to create visually appealing, user-friendly web interfaces for
optimal user experiences.
 Integrated the Chatbot with CRM systems to provide context-aware responses, enhancing the overall customer experience.
 Develop RESTful APIs and Microservices with Golang, ensuring seamless integration with Python-based services.
 Proficiently set up and maintained Linux-based web servers, effectively managing various server-side tasks.
 Optimized relational databases (MySQL, PostgreSQL) for data integrity, security, and high availability. Crafted complex
SQL queries and performed optimizations for improved application performance.
 Contributed to microservices development in a Continuous Delivery environment using Docker and Jenkins. Enhanced
network mapping microservices using Python and deployed on Kubernetes.
 Managed Docker containers through Docker files, automated builds on Docker Hub, and Kubernetes setup for automated
deployment, scaling, and operations across clusters.
 Analyzed SQL scripts and redesigned them using PySpark SQL for faster performance.
 Integrated microservices with CI/CD processes, developing RESTful microservices with Django and deploying on AWS
servers (EBS and EC2).
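
Illustrative sketches for selected items above follow. First, the Kafka publish-subscribe pattern, assuming the kafka-python client; the broker address, topic name, and message fields are placeholders:

    import json
    from kafka import KafkaProducer, KafkaConsumer

    BROKER = "localhost:9092"   # placeholder broker address
    TOPIC = "claims-events"     # hypothetical topic name

    # Publish a JSON-encoded event.
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send(TOPIC, {"claim_id": 123, "status": "open"})
    producer.flush()

    # Consume events from the same topic, oldest first.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )
    for message in consumer:
        print(message.value)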
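
Next, a minimal sketch of FastAPI's built-in OAuth2/JWT support mentioned above, assuming PyJWT for token validation; the secret key and token claims are placeholders:

    from fastapi import Depends, FastAPI, HTTPException
    from fastapi.security import OAuth2PasswordBearer
    import jwt  # PyJWT

    app = FastAPI()
    oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
    SECRET_KEY = "change-me"  # placeholder secret

    def current_user(token: str = Depends(oauth2_scheme)) -> str:
        # Decode and validate the bearer token; reject anything invalid.
        try:
            payload = jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
        except jwt.PyJWTError:
            raise HTTPException(status_code=401, detail="Invalid token")
        return payload["sub"]

    @app.get("/claims")
    def list_claims(user: str = Depends(current_user)):
        return {"user": user, "claims": []}  # illustrative response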
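
Finally, a sketch of Python access to Cassandra using the DataStax driver; the contact point, keyspace, and table are hypothetical:

    from cassandra.cluster import Cluster  # DataStax Python driver

    # Contact points and keyspace are placeholders.
    cluster = Cluster(["127.0.0.1"])
    session = cluster.connect("claims_ks")

    # Prepared statements avoid re-parsing and guard against injection.
    select_stmt = session.prepare(
        "SELECT claim_id, status FROM claims WHERE claim_id = ?"
    )
    row = session.execute(select_stmt, [123]).one()
    if row:
        print(row.claim_id, row.status)
    cluster.shutdown()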

Environment: Python 3.7, Flask, HTML, JavaScript, Scala, Nvidia, Kafka, Grafana, Oracle 12c, AWS, Cassandra, unittest, PyTest, GitHub, JIRA, CNN, TensorFlow, Keras, Snowflake, PyQuery, HTML5, CSS3.

Progressive Insurance – Mayfield, Ohio Sep 2021 - Nov 2023


Sr. Python/Django Developer
Responsibilities:
 Involved in the project life cycle, including design, development, implementation, and verification and validation.
 Extensively utilized Python frameworks like Django, Flask, PyUnit, and libraries like matplotlib.
 Gathered design requirements from the client, analyzed them, and provided solutions that met the requirements.
 Updated and manipulated content and files using Python scripts.
 Created logic in Python for nodes, functions, and pipelines to retrieve and manipulate data.
 Worked on Python OpenStack APIs and used NumPy for numerical analysis.
 Integrated RESTful APIs with Vue.js applications, handling asynchronous data fetching and updating UI accordingly.
 Implemented improvements to enhance the usability, performance, and accuracy of Grafana visualizations.
 Developed Python scripts to interact with Grafana's HTTP API for automated dashboard creation and updates (see the Grafana sketch after this list).
 Used the Pandas and NumPy libraries, including Pandas DataFrame queries, to perform complex table joins and the extraction and transformation of financial data retrieved from NetSuite and migrated to SAP.
 Implemented an ETL pipeline to source data files and cross-reference tables from AWS S3 and perform data conversion using logic implemented in Python.
 Developed programs to automate controller testing in a CI/CD environment using Python, Java, Bash scripts, Git, the Linux command line, and JavaScript.
 Integrated Nvidia SDKs into software solutions.
 Implemented communication layers between Golang and Python services, ensuring efficient data exchange and
processing.
 Optimized code for parallel processing on Nvidia GPUs.
 Utilized Python libraries such as Pandas and NumPy to analyze data stored on HyperFlex systems.
 Designed and implemented Python-based monitoring tools to ensure the optimal performance of the HyperFlex systems.
 Performed troubleshooting and resolved issues related to data replication, consistency, and performance bottlenecks in
Cassandra databases.
 Integrated Platform-as-a-Service (PaaS) offerings such as databases, messaging services, and storage solutions into applications.
 Worked on reading and writing multiple data formats like JSON, ORC, Parquet on HDFS using Spark.
 Worked on the Deployment, Configuration, Monitoring and Maintenance of OpenShift Container Platform.
 Wrote Python code using list comprehensions and OOP to simplify list handling. Worked with Postman for API testing. Developed REST APIs and created a user model for the application.
 Copied data from AWS S3 into Redshift using SQL and performed text mining and analysis using DBeaver.
 Performed the migration of large data sets to Databricks (Spark); created and administered clusters, loaded data, configured data pipelines, and loaded data from ADLS Gen2 into Databricks using ADF pipelines.
 Developed an application that accessed the Cloud Foundry API to monitor trends in development environments using other CF tools: Jenkins, Chef, and Puppet.
 Automated the configuration and scaling of HyperFlex clusters using Python-based Ansible playbooks.
 Ensured data integrity, performed data migration, and managed data backups within SaaS applications.
 Knowledgeable in AWS Lambda, Auto Scaling, CloudFront, RDS, Route 53, AWS SNS, SQS, and SES.
 Fetched Twitter feeds for important keywords using the Tweepy library, stored the tweets as JSON, and generated graphical reports for business decisions using the matplotlib visualization library (see the Tweepy sketch after this list).
 Deployed Golang applications on cloud platforms such as AWS, Azure, or Google Cloud, ensuring compatibility with
Python-based systems.
 Wrote SQL queries in DBeaver to load data from AWS S3 into AWS Redshift and perform transformations.
 Implemented and developed new features for a Plone website, such as Single Sign-On with a CAS server/Java, and a Continuous Integration/Continuous Deployment (CI/CD) pipeline using Jenkins.
 Developed SQL and stored procedures on MySQL, and designed and developed horizontally scalable APIs using Python Flask.
 Created an on-premises CI/CD solution using Jenkins and the Pipeline plugin which uses pipeline as code.
 Participated in requirement gathering and analysis phase of the project in documenting the business requirements by
conducting workshops/meetings with various business users.
 Maintained source file, cross-reference table, and output file locations and parameters in a YAML file for a Kedro pipeline to retrieve and upload data files.
 Hands-on experience in migrating on-premises data to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Cloud Dataproc, Google Cloud Storage, Composer, Pub/Sub, and Cloud Dataflow.
 Developed PySpark programs, created data frames, and worked on transformations.
 Scheduled and ran Airflow DAG jobs to sync and update cross-reference tables from Google Drive to AWS S3 (see the Airflow sketch after this list).
 Updated stakeholders on the results of the conversion program in daily standup meetings and logged status in JIRA.
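
Illustrative sketches for selected items above follow. First, automating Grafana dashboard creation through its HTTP API, assuming the requests library; the Grafana URL, API key, and dashboard model are placeholders:

    import requests

    GRAFANA_URL = "http://localhost:3000"   # placeholder Grafana instance
    API_KEY = "REDACTED"                    # placeholder API token

    # Grafana's dashboard endpoint accepts the full dashboard model as JSON.
    dashboard = {
        "dashboard": {
            "id": None,
            "title": "ETL Throughput",   # hypothetical dashboard title
            "panels": [],
        },
        "overwrite": True,
    }
    resp = requests.post(
        f"{GRAFANA_URL}/api/dashboards/db",
        json=dashboard,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json()["url"])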
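
Next, a sketch of keyword-based tweet collection and visualization with Tweepy and matplotlib; credentials are placeholders, and method names follow Tweepy 4.x (older versions used api.search):

    import json
    import tweepy
    import matplotlib.pyplot as plt

    # Credentials are placeholders for real API keys.
    auth = tweepy.OAuth1UserHandler("KEY", "SECRET", "TOKEN", "TOKEN_SECRET")
    api = tweepy.API(auth)

    tweets = api.search_tweets(q="insurance", count=100)

    # Persist the raw tweets as JSON for later analysis.
    with open("tweets.json", "w") as fh:
        json.dump([t._json for t in tweets], fh)

    # Simple visualization: tweet volume per hour of day.
    hours = [t.created_at.hour for t in tweets]
    plt.hist(hours, bins=24)
    plt.xlabel("Hour of day")
    plt.ylabel("Tweet count")
    plt.savefig("tweet_volume.png")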
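
Finally, a minimal Airflow DAG of the kind used to schedule the cross-reference table sync; the DAG id, schedule, and task body are illustrative, with the actual Drive-to-S3 copy elided:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def sync_cross_reference_tables():
        # Placeholder: the real task copied files from Google Drive to AWS S3.
        print("syncing cross-reference tables")

    with DAG(
        dag_id="sync_xref_tables",       # hypothetical DAG id
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",      # illustrative schedule
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="sync_to_s3",
            python_callable=sync_cross_reference_tables,
        )
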
Environment: Python, PyCharm, Hadoop, Django, Grafana, Kafka, Nvidia, Cassandra, Flask, AWS S3, AWS Redshift, Pandas, NumPy, Spark, GitHub, SQL, JIRA, Agile, and macOS.

Citi Bank - Cincinnati, OH June 2018 – Aug 2021


Sr. Python Developer
Responsibilities:

 Used the Pandas API to organize data in time series and tabular formats for easy timestamp-based manipulation and retrieval (see the time-series sketch after this list).
 Worked on data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
 Created a user review data classification pipeline using Apache Spark and Apache Airflow.
 Created a data extraction pipeline for migrating user review data from a PostgreSQL DB to AWS Redshift.
 Deployed AWS Lambda functions to ingest user purchase data daily and stage it in S3.
 Designed a REST API backend using Django to provide access to the marketplace trends dataset, used by the product management team.
 Wrote and executed various MySQL database queries from Python using the Python MySQL connector and the MySQLdb package.
 Wrote shell scripts and built CI/CD pipelines for application and service delivery into Cloud Foundry via Jenkins, with build and release through Git.
 Used AWS Lambda to run code without managing servers, triggered by S3 and SNS.
 Setup storage and data analysis tools in Amazon Web Services (AWS) cloud computing infrastructure.
 Updated Python scripts to match training data with our database stored in AWS Cloud Search, so that we would be able to
assign each document a response label for further classification.
 Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
 Used Python scripts to update content in the database and manipulate files.
 Generated Python Django Forms to record data of online users.
 Used Python and Django for graphics creation, XML processing, data exchange, and business logic implementation.
 Wrote Python scripts that ran on AWS Lambda to transform and move large amounts of data in and out of AWS S3 and DynamoDB (see the Lambda sketch after this list).
 Developed a Tableau dashboard on top of the metric dataset to visualize mobile ads performance and report to the leadership team.
 Migrated key data pipeline components from MATLAB to Python, deployed them using Lambda, and used CloudWatch for monitoring, reducing firmware maintenance costs for the product.
 Implemented web applications in Flask and Spring frameworks following MVC architecture.
 Added support for Amazon AWS S3 and RDS to host static/media files and the database into Amazon Cloud.
 Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS, and JavaScript.
 Used Django configuration to manage URLs and application parameters.
 Built various graphs for business decision making using Python matplotlib library.
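
Illustrative sketches for selected items above follow. First, the Pandas time-series pattern: parse timestamps, index by them, then slice and resample; the sample data is invented:

    import pandas as pd

    # Hypothetical transaction data with string timestamps.
    df = pd.DataFrame({
        "timestamp": ["2020-01-01 09:30", "2020-01-01 10:45", "2020-01-02 09:15"],
        "amount": [120.0, 75.5, 210.0],
    })

    # Index by parsed timestamps so time-based slicing and resampling work.
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    df = df.set_index("timestamp").sort_index()

    daily_totals = df["amount"].resample("D").sum()   # aggregate per day
    jan_first = df.loc["2020-01-01"]                  # timestamp-based retrieval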
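
Next, a sketch of an AWS Lambda handler moving data from S3 into DynamoDB with boto3; the table name and record layout are hypothetical:

    import json
    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("purchases")  # hypothetical table

    def handler(event, context):
        # Triggered by an S3 put event; load the new object and stage each
        # record into DynamoDB in batches.
        record = event["Records"][0]["s3"]
        obj = s3.get_object(Bucket=record["bucket"]["name"],
                            Key=record["object"]["key"])
        items = json.loads(obj["Body"].read())
        with table.batch_writer() as batch:
            for item in items:
                batch.put_item(Item=item)
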
Environment: AWS, Python, API, PySpark, Django, Java, Oracle, Snowflake, Teradata, Tableau, Unix/Linux, Oracle/SQL & DB2,
Agile, Apache Airflow, EMR, JavaScript.

KinderCare – Portland, OR Jan 2016 – May 2018


Python Developer
Responsibilities:
 Analyzed a pre-existing predictive model developed by the advanced analytics team and the factors considered during model development.
 Analyzed metadata and processed data to gain better insights into the data.
 Analyzed and prepared data, identifying patterns in the dataset by applying historical models.
 Designed and developed data management system using MySQL.
 Built application logic using Python.
 Used Angular.js to build an efficient frontend for the client web application.
 Used Python to extract information from XML files.
 Worked on development of SQL and stored procedures on MySQL.
 Designed and developed a horizontally scalable APIs using Python Flask.
 Used supervised machine learning techniques such as Logistic Regression and Decision Tree classification (see the scikit-learn sketch after this list).
 Worked on large datasets containing both structured and unstructured data.
 Performed data cleaning, using forward-fill methods to handle missing values in the dataset (see the forward-fill sketch after this list).
 Created initial data visualizations in Tableau to provide basic insights into the data for project stakeholders.
 Performed extensive exploratory data analysis using Teradata to improve the quality of the datasets.
 Experienced with various Python libraries such as Pandas and NumPy (one- and two-dimensional arrays).
 Worked with the analytics team to update regular reports and provide solutions.
 Created visualizations of the extracted data using Tableau.
 Identified patterns and meaningful insights by analyzing data.
 Tuned SQL queries to improve performance.
 Performed data modeling in Tableau Desktop.
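
Illustrative sketches for two items above follow. First, Logistic Regression and Decision Tree classification with scikit-learn, shown on synthetic data in place of the real dataset:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic data stands in for the actual project dataset.
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for model in (LogisticRegression(max_iter=1000), DecisionTreeClassifier()):
        model.fit(X_train, y_train)
        print(type(model).__name__, accuracy_score(y_test, model.predict(X_test)))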
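
Next, forward filling of missing values with Pandas; the column is invented:

    import numpy as np
    import pandas as pd

    # Hypothetical column with gaps.
    df = pd.DataFrame({"enrollment": [120, np.nan, np.nan, 135, np.nan]})

    # Forward fill: each missing value inherits the last observed value.
    df["enrollment"] = df["enrollment"].ffill()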

Harvard Pilgrim Healthcare - Boston, MA Sep 2014 – Nov 2015


Python Developer
Responsibilities:
 Wrote Python scripts to parse XML documents and JSON-based REST web services, loading the data into the database and building the pipeline (see the parsing sketch after this list).
 Converted unstructured data in CSV and JSON formats into structured data in Parquet form, saved in AWS S3 buckets.
 Created data zones that included AWS Glue databases and the AWS Glue Data Catalog, providing easy business access and Tableau connection setup.
 Performed SQL optimization for the creation of AWS views in the Glue DB.
 Wrote ORMs for generating complex SQL queries and built reusable code and libraries in Python for future use.
 Worked closely with software developers to debug software and system problems.
 Profiled Python code for optimization and memory management, and implemented multithreading functionality.
 Created stored procedures that retrieve data and help analysts spot trends.
 Designed and developed the server module, resolved issues, and was responsible for its enhancements.
 Worked with the Django ORM module for designing complex queries (see the ORM sketch after this list).
 Used WPF to create user interfaces for Windows operating system.
 Implemented business logic in Python to prevent and detect duplicate claim payments. Rewrote existing Python/Django modules to deliver data in specific formats.
 Used Django Database APIs to access database objects.
 Developed tools using Python, Shell scripting, XML to automate some of the menial tasks.
 Developed scalable and effective applications while managing technology trade-offs.
 Maintained web servers and platforms in the cloud in collaboration with outside vendors. Used GitHub for version control.
 Troubleshot, fixed, and deployed many Python bug fixes for the two main applications that were a primary source of data for both customers and the internal customer service team.
 Implemented automation of test case scenarios using a Python API.
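
Illustrative sketches for two items above follow. First, parsing XML and a JSON REST response and loading rows into a database; SQLite, the URL, and the field names stand in for the production equivalents:

    import json
    import sqlite3
    import urllib.request
    import xml.etree.ElementTree as ET

    # SQLite stands in for the production database.
    conn = sqlite3.connect("claims.db")
    conn.execute("CREATE TABLE IF NOT EXISTS claims (id TEXT, amount REAL)")

    # Load rows from an XML document (file name is a placeholder).
    for claim in ET.parse("claims.xml").getroot().iter("claim"):
        conn.execute("INSERT INTO claims VALUES (?, ?)",
                     (claim.get("id"), float(claim.findtext("amount"))))

    # Load rows from a JSON REST endpoint (URL is a placeholder).
    with urllib.request.urlopen("https://example.com/api/claims") as resp:
        for item in json.load(resp):
            conn.execute("INSERT INTO claims VALUES (?, ?)",
                         (item["id"], item["amount"]))
    conn.commit()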
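
Next, a Django ORM sketch of a complex query using Q objects and aggregation, assuming a configured Django project; the Payment model, its app, and its fields are hypothetical:

    from django.db.models import Count, Q
    # 'payments.models.Payment' is a hypothetical model for illustration.
    from payments.models import Payment

    # Payments that are either flagged or marked duplicate, counted per member.
    suspect = (
        Payment.objects
        .filter(Q(flagged=True) | Q(status="duplicate"))
        .values("member_id")
        .annotate(n=Count("id"))
        .order_by("-n")
    )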

Environment: Python, Oracle, JSON, XML, Django, API, SQL, REST, AWS.

EDUCATION:

2009 – MS in Computer Science, USA.
