

Manoj Kumar

Career Summary
I have over nine years of IT experience as a Software Developer across all phases of the SDLC, including analysis, design, coding, testing, and deployment. I have data warehousing experience building and managing various data warehouses and data marts. I am quick to grasp new concepts, both technical and business-related, and apply them as needed, and I have worked extensively with offshore teams. I am a positive team player and self-starter with a results-oriented attitude, excellent communication and interpersonal skills, and the ability to manage multiple priorities.

Education

 Master of Science in Computers & Electrical Engineering, University of Missouri, Kansas City, 2015


 Bachelor of Technology, JNTU, Andhra Pradesh, India, 2010

Technical Skills
Applications and Tools: Informatica Power Center 7.X/8.X/9.X/10.1, SSIS Packages, Visual Studio
Languages: C++, PHP, VB, HTML, XHTML, JavaScript, XML, AJAX, SQL, BTEQ
Databases: Oracle 9i/10g, MS Access, DB2, Teradata, MS SQL Server
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling
Operating Systems: MS Windows, UNIX, Linux
Reporting Tools: SSRS, Tableau
Scheduling Tools: Tidal, Control M, Informatica Scheduler
Agile Tools: Tortoise SVN, Redgate, JIRA, Confluence, FishEye, Crucible, Subversion, Bitbucket

Professional Experience
Geisinger, Rockville, MD November 2018 – Present
ETL Developer

Responsibilities:
 Design, develop, test, and implement software (database triggers and stored procedures) that works within complex databases to insert, update, delete, and report on millions of records efficiently and accurately, using query optimization and parameterized queries to cover the various use cases in which the data needs to be addressed. Design, develop, implement, and support queries against relational databases such as Oracle, Teradata, DB2, and SQL Server, writing complex queries in SQL and PL/SQL.
 Experience working with ETL against large-scale, terabyte-sized data sets and databases.
 Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Sequence Generator, Update Strategy, Stored Procedure, and Normalizer.
 Involved in performance tuning Informatica mappings with Normalizer, Joiner, Aggregator, Sequence
Generator, Router, Filter transformations.
 Used Workflow Manager for creating workflows, Worklets, emails, and command tasks.
 Performed Unit Testing and Integration Testing on the mappings.
 Created Stored Procedures, Triggers, Synonyms, and Indexes at Database level.
 Used JIRA for change management and defect management for audit purposes.
 Involved in high-level research in data analysis and predictive scoring models involving imbalanced data.
 Converted data from legacy systems (older versions of applications) into SQL Server using stored procedures and ETL processes.
 Experience in query optimization and performance tuning using SQL Profiler and execution plans, and in troubleshooting the transaction log.
 Performed data pulls from different PIDB databases for both identified and de-identified data using SQL Server Management Studio.
 Worked on health care data with the Research team, receiving intake forms from physicians and researchers and performing the requested data pulls based on each intake form.
 Worked on data migration for new version of OMOP project and documented the data and table
discrepancies between the versions of OMOP.
 Regularly updated the Nutrition database with a monthly input file using SQL scripts, and used SQL Server Reporting Services (SSRS) to produce monthly nutrition reports for end users.
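The parameterized-query approach mentioned above can be sketched in miniature as follows. This is a generic illustration only, not code from any of the projects described: the table, columns, and data are hypothetical, and Python's built-in sqlite3 module stands in for the actual SQL Server / Oracle environments.

```python
import sqlite3

# In-memory database with a hypothetical records table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, mrn TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO records (mrn, status) VALUES (?, ?)",
    [("A100", "active"), ("A101", "inactive"), ("A102", "active")],
)

def records_by_status(status):
    # Parameterized query: the driver binds the value, so input is never
    # spliced into the SQL string (prevents injection and lets the engine
    # reuse the prepared statement plan).
    cur = conn.execute("SELECT mrn FROM records WHERE status = ?", (status,))
    return [row[0] for row in cur]

print(records_by_status("active"))  # ['A100', 'A102']
```

The same binding pattern applies regardless of the target database; only the placeholder syntax (`?`, `:name`, `@p1`) varies by driver.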

Anthem, Norfolk, VA December 2013 – November 2018


Senior Developer

Responsibilities:
 Involved in Informatica server installations and upgrades; used Informatica 9.6.1 and worked on the 10.1 upgrade.
 Participated in Agile methodology and Agile development; completed Agile 101 training.
 Used JIRA, Confluence, FishEye, Crucible, Subversion, and Redgate as Agile tools for change-control tracking, change-control documentation, and coding standards.
 Involved in creating the data mart that pulls data from the Oracle FACETS platform and loads it into SQL Server after applying global rules; also worked on performance tuning techniques to improve data mart performance.
 Worked on provider data systems with the PDX team, producing provider network file (PNF) submissions to different states to demonstrate network adequacy.
 Mentored new team members, training them on how PNFs work through multiple knowledge-transfer (KT) sessions covering the standards for all the technical information needed.
 Standardized the technical design document (TDD) for the PDX team by gathering input from all team members, and created technical design documents for all markets on each project.
 Experience working with ETL against large-scale, terabyte-sized data sets and databases.
 Created complex Informatica mappings with extensive use of Aggregator, Union, Update Strategy, Filter, Router, Normalizer, WSDL, Joiner, and Sequence Generator transformations.
 Defined Target Load Order Plan and Constraint based loading for loading data appropriately into multiple
target tables.
 Created evaluation, submission, and response processes using mapplets, mappings, worklets, and workflows based on the detailed design process flow.
 Created pre- and post-session scripts to check for file existence.
 Worked on performance tuning to improve data mart performance by applying partitioning techniques.
 Rebuilt indexes on tables using SQL Server commands to reduce fragmentation.
 Performed performance tuning at the source, target, mapping, and session levels.
 Responsible for Unit testing and Integration testing of mappings, sessions, workflows and complete
process flow.
 Documented existing mappings as per standards and developed template for mapping specification
document.
 Worked on the technical design document, participated in the code review process, and helped create the code review checklist and the standards document for code reviews.
 Migrated code from DEV to QA using CTU, packaged the code into Tortoise Subversion, and created the deployment scripts, deployment forms, and BOIP forms for PROD deployments.
 Used Tidal to schedule Informatica jobs and to apply dependencies to the workflows.
 Created run books for handover to the operations team to monitor Tidal nightly loads.
 Involved in Production support activities and monitored scheduled jobs through Tidal.
 Worked with SQL queries and created stored procedures for data transformation.
 Used SQL Server Reporting Services (SSRS) to produce the error report for the PDQ (Provider Data Quality) team.
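Constraint-based loading of the kind described above means loading parent tables before the child tables that reference them, so foreign keys are never violated. A minimal sketch, with hypothetical tables and data and sqlite3 standing in for the actual target databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce FK constraints
conn.execute("CREATE TABLE provider (provider_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE claim (claim_id INTEGER PRIMARY KEY, "
    "provider_id INTEGER REFERENCES provider(provider_id))"
)

# Target load order: parent (provider) rows first, then the child (claim)
# rows that reference them; reversing the order would raise a
# foreign-key violation on insert.
conn.executemany("INSERT INTO provider VALUES (?, ?)", [(1, "Clinic A"), (2, "Clinic B")])
conn.executemany("INSERT INTO claim VALUES (?, ?)", [(10, 1), (11, 2)])

loaded = conn.execute(
    "SELECT COUNT(*) FROM claim JOIN provider USING (provider_id)"
).fetchone()[0]
print(loaded)  # 2
```

In Informatica terms, the Target Load Order Plan (and the constraint-based load ordering session property) enforces this parent-before-child sequencing automatically across multiple targets in one mapping.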

Amerigroup, Virginia Beach, VA March 2012 – December 2013


Informatica Developer

Responsibilities:
 Reviewed and translated the Business Requirements Document into a Technical Specifications Design (TSD) for the VA Claims track.
 Extensively worked on BTEQ scripts to load large volumes of data into the Enterprise Data Warehouse (EDW).
 Integrated data into the EDW by sourcing it from flat files and Oracle tables.
 Extracted data using BTEQ scripting, including the FastExport and MultiLoad utilities.
 Used Informatica 9.1 and 8.6.1 for ETL processes.
 Developed shell scripts to parameterize the date values for the incremental extracts
 Worked on database objects like Tables, Stored Procedures, Views, Triggers, Rules, Defaults, user defined
data types and functions.
 Created DTS packages to transform and manipulate data from a wide range of sources.
 Used Tidal to schedule Informatica jobs and to apply dependencies to the workflows.
 Worked on stored procedures to load data into relational tables.
 Used SQL Server Reporting Services (SSRS) to produce the annual call rate report for business users from stored procedures.
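Parameterizing date values for incremental extracts, as described above, typically means computing an extract window from the last successful run date and substituting it into the extract query or parameter file. The original work used shell scripts; this is the same idea sketched in Python, with hypothetical dates and a hypothetical table name:

```python
from datetime import date, timedelta

def incremental_window(last_run, today):
    """Return (start, end) ISO date strings for an incremental extract:
    everything after the last successful run, up through yesterday."""
    start = last_run + timedelta(days=1)
    end = today - timedelta(days=1)
    return start.isoformat(), end.isoformat()

start, end = incremental_window(date(2013, 6, 1), date(2013, 6, 5))

# The window is then substituted into the extract query or written to a
# BTEQ/Informatica parameter file before the nightly load runs.
query = f"SELECT * FROM claims WHERE load_dt BETWEEN DATE '{start}' AND DATE '{end}'"
print(start, end)  # 2013-06-02 2013-06-04
```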

DISCOUNT TIRES, Phoenix, AZ September 2011 – February 2012


Informatica Developer

Responsibilities:
 Worked closely with individuals at various levels to coordinate and prioritize multiple projects; estimated, scheduled, and tracked ETL projects throughout the SDLC.
 Provided day-to-day support for the Document Management System (DMS) ensuring system
stability and data integrity.
 Worked on change data capture (CDC) to capture changed data from the legacy iSeries environment using Informatica PowerExchange 9.0.1.
 Automated jobs through scheduling with the Informatica Tivoli scheduler and UNIX scripts, which run daily and maintain data validations.
 Led and coordinated the testing and migration of our team’s 20 production Informatica 8.6 code
folders to Informatica 9.0.1. Documented and communicated issues and resolutions.
 Created deployment and back-out scripts to deploy the code from Development environment to QA and
QA to PROD.
 Worked on performance tuning of Informatica mappings and sessions.
 Worked on developing slowly changing dimension (SCD) TYPE I and TYPE II mappings to load SCD tables.
 Worked on the backend of the customer product review ratings on the Discount Tire site, pulling the data into an Informatica flow and migrating it into SQL Server 2008 using Informatica PowerCenter 9.0.1.
 Worked on Unit Testing and created unit testing documents.
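A Slowly Changing Dimension Type II load, as referenced above, expires the current dimension row and inserts a new version whenever a tracked attribute changes (Type I would simply overwrite). A minimal sketch of the pattern, using sqlite3 and hypothetical customer data rather than the actual mappings:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (cust_id TEXT, city TEXT, "
    "eff_date TEXT, end_date TEXT, is_current INTEGER)"
)

def scd2_upsert(cust_id, city, as_of):
    """Type II logic: expire the current row if the tracked attribute
    changed, then insert a new current row effective as_of."""
    row = conn.execute(
        "SELECT city FROM dim_customer WHERE cust_id = ? AND is_current = 1",
        (cust_id,),
    ).fetchone()
    if row is not None and row[0] == city:
        return  # no change: nothing to do
    if row is not None:
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE cust_id = ? AND is_current = 1",
            (as_of, cust_id),
        )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, city, as_of),
    )

scd2_upsert("C1", "Phoenix", "2011-10-01")
scd2_upsert("C1", "Tempe", "2011-12-01")   # city change -> new version
history = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_date"
).fetchall()
print(history)  # [('Phoenix', 0), ('Tempe', 1)]
```

In an Informatica mapping, the same branching is typically built with a Lookup on the current dimension row, an Expression to compare attributes, and Update Strategy transformations routing rows to DD_UPDATE (expire) and DD_INSERT (new version).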

ABBOTT Laboratories, PA June 2011 – August 2011


Informatica Developer

Responsibilities:
 Created the detailed design document and the source-to-target matrix document based on the requirements specification document and business logic.
 Developed reusable mapplets and reusable transformations for standard business units to be used in
multiple mappings.
 Worked on parameter files, mapping and workflow variables.
 Defined a target load order plan for loading data into different target tables.
 Debugged issues in the Informatica code using the Informatica Debugger, and manually unit tested the code with test cases.
 Implemented performance tuning techniques by identifying and resolving the bottlenecks in sources,
targets, transformations, mappings and sessions to improve the performance.
 Created tables, views, and indexes on Teradata 13.10, and inserted sample data into the database for testing purposes.
 Worked on version control of code across the Development, Test, and Production environments.

TRIBIS INC, MI September 2010 – May 2011


Informatica Developer

Responsibilities:
 Analyzed the business requirements and suggested changes to the design accordingly.
 Created Informatica mappings with transformations including Source Qualifier, Lookup, Filter, Expression, Aggregator, and more.
 Created work/shared folders, users, and groups, and managed permissions and privileges for groups and users using the Informatica Repository Manager.
 Developed Mapplets, mappings and configured sessions.
 Created reusable transformations and Mapplets to use in multiple mappings.
 Worked efficiently with Designer tools including Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
 Created PL/SQL Functions and Stored Procedures for the processing of data from Warehouse to Mart.
 Tested Mappings and fixed bugs using Debugger.
 Performed scheduling techniques with ETL jobs using scheduling tools.
 Used SQL tools like TOAD to run SQL queries and validate the data in Data warehouse.
 Extracted the data for specific branches and automated the process.
 Tested the Informatica Sessions in the Workflow Manager.
 Created PL/SQL procedures in Oracle 10g.
 Involved in creating the workflow and planning its schedule.
 Involved in setting up users through Supervisor and assigning them privileges to meet security requirements.
 Executed Scripts, managed scheduled activities based on business requirements.
