
HTC Global Services, Inc.
3270 West Big Beaver Road, Troy, MI 48084
248.786.2500

Nataraj Virupaksham
9620203536

Nataraj Virupaksham has 11+ years of Information Technology experience, with expertise in data warehouse development projects covering the analysis, design, and development of various business applications on different platforms using Informatica PowerCenter 10.x/9.x/8.x, Informatica Cloud (IICS), and Informatica ILM 6.2.
Major Strengths:

 Extensive experience delivering data warehousing implementations, data migration, and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services.
 Design, Development, Testing and Implementation of ETL processes using
Informatica IICS
 Experience in IICS Application Integration components such as Processes, Service Connectors, and Process Objects.
 Developed Cloud mappings to extract data for different regions (APAC, UK, and America).
 Knowledge of data integration, application integration, data quality, and data management using Informatica IICS.
 Expertise in methodologies for data extraction, transformation, and loading using various transformations such as Expression, Router, Filter, Lookup, Update Strategy, Union, and Aggregator.
 Extensive experience in developing Informatica Cloud mappings, data replication, and task creation.
 Helped IT reduce the cost of maintaining the on-premises Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
 Experience in creating Mappings and Mapplets using various Connected and
Unconnected transformations like Source Qualifier, Aggregator, Expression,
Lookup, Filter, Joiner, Union, Router, Rank, Sequence Generator and Update
Strategy transformations.
 Expertise in implementing complex business rules by creating Reusable
Transformations, Mapplets, Sessions and Worklets and made use of the Shared
Folder concept using shortcuts wherever possible to avoid redundancy.
 Experience in creating, executing, and monitoring workflows with Workflow Manager and Workflow Monitor.
 Expertise in creating complex mappings, sessions, worklets, and workflows in Informatica PowerCenter 9.5.1.
 Expertise in applying Teradata utilities such as FastLoad and MultiLoad.
 Involved in deployment activities to deploy multiple Informatica objects from DEV to QA and then to PROD.
 Experience in Live Archiving and Application Retirement.
 Very good experience in installing and configuring Informatica ILM and FAS 6.2.
 Experience in creating users, source and target connections, and security groups.
 Experience in creating, scheduling and monitoring ILM Jobs.
 Experience in EDM (Enterprise Data Manager).
 Hands-on experience in creating archive databases and files, and restoring archived databases and files.

 Strong experience in IDV (Informatica Data Vault / FAS).
 Hands-on experience in Informatica ILM administration.
 Worked on the Data Discovery portal using the Browse Data and Search File Archive options; very good experience in querying the Data Vault using SQL Worksheet (see the SQL sketch after this list).
 Built Data Archive workflows for custom and packaged applications.
 Experienced in integrating various data sources such as Oracle, SQL Server, and fixed-width and delimited flat files.
 Strong experience with the data visualization reports available in Informatica ILM 6.2.
 Experience in development projects using Informatica PowerCenter 10.x/9.x.
 Hands-on experience with the data warehousing ETL tool Informatica and fair knowledge of its methodologies and concepts.
 Worked on SAP sources and pulled data from different SAP tables using the Generate BCI mappings option available in Informatica PowerCenter.
 Changed the metadata of SAP tables in Informatica by using the Generate and Install SAP R/3 Code option.
 Worked extensively on implementing Slowly Changing Dimensions (SCDs) Type I, II, and III in different mappings as per requirements.
 Experience in both dimensional and relational modeling concepts such as star-schema modeling and snowflaking.
 Experience in integrating various data sources such as Oracle, DB2, SQL Server, and flat files into the staging area.
 Experience in creating and reviewing functional design documents, technical requirements, LLDs, and runbooks.
 Experience in SQL and PL/SQL programming.
 Experience in UNIX.
 Expertise in scheduling tools such as Control-M, Autosys, and DAC to schedule Informatica jobs.
 Experience in IOD (Informatica on Demand) / Informatica Cloud.
 Experience in archiving and decommissioning applications using the ILM (Information Lifecycle Management) 6.2 tool.
 Experience in both database and file archiving.
 Organized Corporate External Training on Informatica ILM.
 Experience in testing source and target data using the Informatica DVO (Data Validation Option) tool.
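
For illustration, a minimal sketch of the kind of query run against the Data Vault from SQL Worksheet, as noted in the Data Discovery bullet above; the schema, table, and column names (claims_archive.claim_hdr, etc.) are hypothetical, not taken from any client engagement:

    -- Hypothetical archived table; all names are illustrative only.
    SELECT claim_id,
           policy_no,
           claim_status,
           loss_date
    FROM   claims_archive.claim_hdr        -- retired application's table in the Data Vault
    WHERE  claim_status = 'CLOSED'
    AND    loss_date < DATE '2010-01-01'   -- retention-driven cut-off
    ORDER  BY loss_date;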
Education
 Bachelor's degree in Computer Science from Sri Krishna Devaraya University, 2002-2005.


Technical Skills

Operating Systems: Windows XP/2000/9x, NT, UNIX
Informatica Tools: Informatica 10.x/9.x/8.x, Informatica Cloud (IICS), Informatica Lifecycle Management (ILM) 6.2, IDV 6.2 (FAS), Control-M, PuTTY, Autosys
Databases: Oracle 11g, Teradata, MS SQL Server, MySQL, Greenplum, DB2, MS Access

Professional Experience:

Employer : HTC Global Services (Bangalore)


Title : Technical Lead.
Date of Employment : Jan 2021 till date

Employer : DELL Technologies (Bangalore)


Title : Technical Lead.
Date of Employment : Apr 2019 to Jan 2021

Employer : Capgemini India Pvt Ltd (Bangalore)


Title : Senior Informatica Consultant
Date of Employment : Sep 2014 to Apr 2019

Employer : Step 2 Technologies Pvt Ltd (Hyderabad)


Title : Informatica Consultant.
Date of Employment : Jan 2014 to Aug 2014

Employer : HCL Technologies (Bangalore)


Title : Informatica Developer.
Date of Employment : Apr 2011 to Jan 2014

<Natraj Virupaksham> 3
HTC Global Services Inc.

Project Information
Name : AMFAM PROD SUPPORT
Client : AMFAM
Role : Senior Technical Lead
Platform : UNIX
Tools : Informatica 10.2.0, Autosys
RDBMS : Oracle, Greenplum.
Duration : Jan 2021 till date

Description:
American Family Insurance (AMFAM) is a private mutual company that focuses on property, casualty, and auto insurance, and also offers commercial insurance, life, health, and homeowners coverage as well as investment and retirement-planning products.
In this production support and maintenance project, I am mainly involved in resolving tickets and have also worked on code changes for service requests.

Responsibilities:

• Expertise in Analysis, Design, Development, Implementation, Testing, and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter.
• Design, Development, Testing, and Implementation of ETL processes using Informatica Cloud.
• Worked on Data Integration and Transformations using Informatica IICS.
• Worked on Different IICS Connectors.
• Developed ETL programs using Informatica to implement the business requirements.
• Communicated with Business customers to discuss the issues and requirements.
• Performed performance tuning at the functional level and mapping level; used relational SQL wherever possible to minimize data transfer over the network (see the SQL override sketch after this list).
• Extensively implemented Expression, Aggregator, Filter, Joiner, Normalizer, Source Qualifier, Sorter, Router, Transaction Control, Rank, Union, connected and unconnected Lookup, Sequence Generator, and Update Strategy transformations in Informatica, keeping performance in mind and pushing logic down to the database level where necessary to minimize the impact on loading sessions.
• Created reusable expressions and mappings in shared folders for other developers to reuse, with full end-to-end knowledge of data movement for coding reusable objects used widely across folders.
• Effectively used Informatica parameter files for defining mapping variables, workflow
variables, FTP connections and relational connections.
• Involved in enhancements and maintenance activities of the data warehouse including
tuning for code enhancements.
• Thoroughly followed standards and best practices throughout the ETL process, including naming conventions for various Informatica objects (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports).
• Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.
• Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.


• Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used Control-M to schedule jobs.
• Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
• Performed unit testing at various levels of the ETL and actively involved in team code
reviews.
• Identified problems in existing production data and developed one-time scripts to correct
them.
• Fixed invalid mappings and troubleshot technical problems in the database.
• Applied Teradata utilities such as FastLoad and MultiLoad.
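
As an example of the relational-SQL approach noted above, a Source Qualifier override of roughly this shape pushes the join and filter into the database so only qualifying rows cross the network; the table and column names here are hypothetical:

    -- Hypothetical Source Qualifier SQL override; names are illustrative only.
    SELECT p.policy_no,
           p.effective_dt,
           c.claim_amt
    FROM   policy p
    JOIN   claim  c ON c.policy_no = p.policy_no
    WHERE  p.status        = 'ACTIVE'
    AND    c.claim_open_dt >= TRUNC(SYSDATE) - 1   -- incremental window: prior day onward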

Project Information
Name : EBIA - SAFT
Client : Dell Internal
Role : Technical Lead
Platform : UNIX
Tools : Informatica 10.2.0, IICS, Control-M, Nexus
RDBMS : Oracle, Teradata.
Duration : May 2019 to Dec 2020

Description:
Dell Technologies is a leader in digital transformation, providing digital technology solutions, products, and services to drive business success. SAFT (Standard Audit File for Tax) is a system with several data flows across the LZ, CDL, and RDL layers. We pull data from the Oracle source system, generate an LRF file from the source, and then transform the data from the LRF file to the Inc layer, then to LZ and CDL, and finally to the RDL layer.

Responsibilities:

• Expertise in Analysis, Design, Development, Implementation, Testing, and Support of Data Warehousing and Data Integration solutions using Informatica PowerCenter.
• Design, Development, Testing, and Implementation of ETL processes using Informatica Cloud.
• Worked on Data Integration and Transformations using Informatica IICS.
• Worked on Different IICS Connectors.
• Developed ETL programs using Informatica to implement the business requirements.
• Communicated with Business customers to discuss the issues and requirements.
• Performed performance tuning at the functional level and mapping level; used relational SQL wherever possible to minimize data transfer over the network.
• Extensively implemented Expression, Aggregator, Filter, Joiner, Normalizer, Source Qualifier, Sorter, Router, Transaction Control, Rank, Union, connected and unconnected Lookup, Sequence Generator, and Update Strategy transformations in Informatica, keeping performance in mind and pushing logic down to the database level where necessary to minimize the impact on loading sessions.
• Created reusable expressions and mappings in shared folders for other developers to reuse, with full end-to-end knowledge of data movement for coding reusable objects used widely across folders.
• Effectively used Informatica parameter files for defining mapping variables, workflow
variables, FTP connections and relational connections.
• Involved in enhancements and maintenance activities of the data warehouse including
tuning for code enhancements.
• Thoroughly followed standards and best practices throughout the ETL process, including naming conventions for various Informatica objects (transformations, sessions, mappings, workflow names, log files, bad files, and input, variable, and output ports).
• Worked effectively in a version-controlled Informatica environment and used deployment groups to migrate objects.
• Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
• Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used Control-M to schedule jobs.
• Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
• Performed unit testing at various levels of the ETL and actively involved in team code
reviews.
• Identified problems in existing production data and developed one-time scripts to correct
them.
• Fixed invalid mappings and troubleshot technical problems in the database.
• Applied Teradata utilities such as FastLoad and MultiLoad.

Project Information
Name : SLB ETL DEV
Client : Schlumberger Ltd, Houston, United States
Role : Informatica Senior Developer
Platform : UNIX
Tools : Informatica 10, DVO, PuTTY, SQL Server, SAP
RDBMS : Oracle, Teradata.
Duration : 05-Nov-2017 to 30-Apr-2019

Description:
Schlumberger Limited is the world's largest oilfield services company. Schlumberger employs approximately 95,000 people representing more than 140 nationalities, working in more than 85 countries. SLB ETL includes modules such as Sales Order, Purchase Order, General Ledger, and Demantra. We pulled data from different SAP tables, SQL Server tables, and Siebel tables and moved the data to an Oracle database according to the business rules provided by the client.
Responsibilities:

<Natraj Virupaksham> 6
HTC Global Services Inc.

• Understood the business requirements and implemented them in a functional database/ETL mapping design; experience in creating users and source and target connections.
• Worked on SAP sources and pulled data from different SAP tables using the Generate BCI mappings option available in Informatica PowerCenter.
• Changed the metadata of SAP tables in Informatica by using the Generate and Install SAP R/3 Code option.
• Pulled data from a legacy Siebel database as well as SQL Server databases.
• Created mappings using transformations such as Source Qualifier, Lookup, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Implemented complex business rules by creating Reusable Transformations, Mapplets,
Sessions and Worklets and made use of the Shared Folder concept using shortcuts
wherever possible to avoid redundancy.
• Created, executed, and monitored workflows with Workflow Manager and Workflow Monitor.
• Handled complex workflows that include a large number of sessions and worklets.
• Applied Teradata utilities such as FastLoad and MultiLoad.
• Created parameter files for respective sessions and workflows.
• Worked on different tasks such as Assignment, Email, Control, and Decision available in Workflow Manager in Informatica PowerCenter.
• Involved in deployment activities to deploy multiple Informatica objects from DEV to QA and then to PROD.
• Created deployment groups in Informatica to record deployment objects.
• Worked extensively on implementing Slowly Changing Dimensions (SCDs) Type I, II, and III in different mappings as per requirements (a Type II sketch follows this list).
• Involved in documentation, reviewing and creating LLDs and HLDs for all systems used in the SLB project.
• Actively involved in daily calls and onshore-offshore handovers to understand the business requirements from onsite.
• Involved in integrating various data sources such as SAP, SQL Server, Oracle, and flat files into the staging area.
• Actively involved in monitoring, scheduling, and holding Informatica jobs using the Autosys scheduling tool.
• Experience in understanding UNIX scripts used for Informatica scheduling.
• Created Autosys documents and other project-specific documents.
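
A minimal SQL sketch of the SCD Type II pattern referenced above: expire the current row when a tracked attribute changes, then insert the new version. The dimension, staging, and sequence names are hypothetical, not from the actual project:

    -- Step 1: expire the current version of rows whose tracked attribute changed.
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.end_dt       = TRUNC(SYSDATE) - 1
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- Step 2: insert brand-new customers and new versions of changed customers
    -- (after Step 1 the changed customers no longer have a current row).
    INSERT INTO dim_customer
           (customer_key, customer_id, address, start_dt, end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');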

Project Information
Name : CSS ODS Staging
Client : Farmers Insurance
Role : Informatica Powercenter Developer
Platform : UNIX, Windows
Tools : Informatica 9.5.1, DVO, PuTTY, Autosys
RDBMS : DB2
Duration : 20-Apr-2015 to 01-Nov-2017

Description:

<Natraj Virupaksham> 7
HTC Global Services Inc.

Farmers Insurance serves more than 10 million households with more than 20 million individual policies across all 50 states through the efforts of over 50,000 exclusive and independent agents and nearly 24,000 employees. CSS ODS Staging mainly retrieves insurance claims from three systems named PLA, MDM, and LIFE. We moved legacy data into the PLA, MDM, and LIFE systems by applying business rules.
Responsibilities:

• Understood the business requirements and implemented them in a functional database/ETL mapping design; experience in creating users and source and target connections.
• Strong development experience creating ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, DB2, SFDC, XML, CSV, and delimited files, transform it based on business requirements, and load it to the data warehouse.
• Created mappings using transformations such as Source Qualifier, Lookup, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Implemented complex business rules by creating Reusable Transformations, Mapplets,
Sessions and Worklets and made use of the Shared Folder concept using shortcuts
wherever possible to avoid redundancy.
• Created, executed, and monitored workflows with Workflow Manager and Workflow Monitor.
• Worked on different tasks such as Assignment, Email, Control, and Decision available in Workflow Manager in Informatica PowerCenter.
• Involved in deployment activities to deploy multiple Informatica objects from DEV to QA and then to PROD.
• Created deployment groups in Informatica to record deployment objects.
• Worked extensively on implementing Slowly Changing Dimensions (SCDs) Type I, II, and III in different mappings as per requirements.
• Involved in documentation, reviewing and creating LLDs and runbooks for all systems used in ODS projects.
• Actively involved in daily calls and onshore-offshore handovers to understand the business requirements from onsite.
• Involved in integrating various data sources such as DB2, SFDC, and flat files into the staging area.
• Actively involved in monitoring, scheduling, and holding Informatica jobs using the Autosys scheduling tool.
• Experience in understanding UNIX scripts used for Informatica scheduling.
• Created Autosys documents and other project-specific documents.


Project Information
Name : Purge Framework
Client : Farmers Insurance
Role : Informatica ILM Developer
Platform : UNIX, Windows
Tools : ILM 6.2, DVO, PuTTY
RDBMS : Oracle 10g, SQL Server
Duration : 01-Oct-2014 to 19-Apr-2015

Description:
Farmers Insurance serves more than 10 million households with more than 20 million individual policies across all 50 states through the efforts of over 50,000 exclusive and independent agents and nearly 24,000 employees. The Purge Framework mainly retrieves insurance claims from the Data Vault via Informatica PowerCenter and generates reports according to the business rules provided by the client. After the reports are generated, the legacy data is purged.

Responsibilities:

• Worked on Live Archiving and Application Retirement.
• Created users, source and target connections, and security groups using Informatica ILM administration.
• Hands-on experience in creating archive databases and files, and restoring archived databases and files.
• Strong experience in IDV (Informatica Data Vault / FAS).
• Worked on the Data Discovery portal using the Browse Data and Search File Archive options.
• Created accounts for users and provided them appropriate roles in the ILM tool.
• Created source and target connections using the ILM tool.
• Created entities, loaded the appropriate tables and views, and mined them through EDM (Enterprise Data Manager).
• Created new retirement projects and archived the applications successfully using the ILM tool.
• Monitored the ILM jobs after successful run of each retirement project.
• Generated the row count validation report after successful completion of ILM jobs (see the validation sketch after this list).
• Successfully archived tables/views into FAS (File Archive Service).
• Tested source and target data using the Informatica DVO (Data Validation Option) tool.
• Created users, folders, and ODBC DSN connections, and imported source and target tables using Informatica PowerCenter Designer.
• Created and modified the ODBC.INI file for FAS.
• Generated reports using Data Visualization in ILM.
• Created and configured repository connections in DVO.
• Created and added table pairs, added tests, generated value tests, and ran the table-pair tests in DVO.
• Debugged failed table-pair tests in DVO.
• Generated the DVO reports as per the business requirements.
• Raised service requests with Informatica to resolve different complex ILM and DVO issues.
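
A sketch of the row-count validation mentioned above, with one count taken on each side and compared; the schema and table names are hypothetical:

    -- Run against the source database before purge:
    SELECT COUNT(*) AS source_rows FROM claims_src.claim_hdr;

    -- Run against the Data Vault over its ODBC connection (e.g. in SQL Worksheet):
    SELECT COUNT(*) AS archived_rows FROM claims_fas.claim_hdr;

    -- The two counts must match for the table to pass validation.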


Project Information
Name : Tenix Data Archiving
Client : Tenix Data Solutions
Role : Informatica ILM Developer
Platform : UNIX, Windows
Tools : Informatica PowerCenter 9.5, Information Lifecycle Management (ILM) 6.2, DVO, PuTTY
RDBMS : Oracle 10g, SQL Server
Duration : Feb 2014 to Aug 2014

Description:
Tenix is a privately owned Australian company providing a range of infrastructure maintenance and engineering products and services to the utility, transport, mining, and industrial sectors in Australia, New Zealand, the Pacific Islands, and the United States. The Data Archival project mainly archives data by decommissioning legacy systems and automating data retention according to rules. Its goals are to archive obsolete data to enhance performance, reduce database size and administration costs, and support data retention rules by creating separate archives based on varying data lifetimes.

Responsibilities:

• Created users, source and target connections, and security groups using Informatica ILM administration.
• Hands-on experience in creating archive databases and files, and restoring archived databases and files.
• Created accounts for users and provided them appropriate roles in the ILM tool.
• Monitored the ILM jobs after successful run of each retirement project.
• Generated the row count validation report after successful completion of ILM jobs.
• Successfully archived tables/views into FAS (File Archive Service).
• Tested source and target data using the Informatica DVO (Data Validation Option) tool.
• Created users, folders, and ODBC DSN connections, and imported source and target tables using Informatica PowerCenter Designer.
• Created entities, loaded the appropriate tables and views, and mined them through EDM (Enterprise Data Manager).
• Created new retirement projects and archived the applications successfully using the ILM tool.
• Created and modified the ODBC.INI file for FAS.
• Worked on Live Archiving and Application Retirement.
• Strong experience in IDV (Informatica Data Vault / FAS).
• Worked on the Data Discovery portal using the Browse Data and Search File Archive options.
• Generated reports using Data Visualization in ILM.
• Created and configured repository connections in DVO.
• Created and added table pairs, added tests, generated value tests, and ran the table-pair tests in DVO (a plain-SQL equivalent is sketched after this list).
• Debugged failed table-pair tests in DVO.
• Generated the DVO reports as per the business requirements.


• Raised service requests with Informatica to resolve different complex ILM and DVO issues.
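
For illustration, the plain-SQL equivalent of a DVO table-pair value test: either MINUS query returning rows means the pair fails. The schema and table names are hypothetical:

    -- Rows present in the source but missing (or different) in the target:
    SELECT claim_id, policy_no, claim_amt FROM src.claim_detail
    MINUS
    SELECT claim_id, policy_no, claim_amt FROM tgt.claim_detail;

    -- And the reverse direction:
    SELECT claim_id, policy_no, claim_amt FROM tgt.claim_detail
    MINUS
    SELECT claim_id, policy_no, claim_amt FROM src.claim_detail;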

Project Information
Name : DB Feeds
Client : Deutsche Bank
Role : Informatica Developer
Platform : UNIX, Windows
Tools : Informatica 8.6 / Informatica Cloud.
RDBMS : Oracle 10g
Other Tools : Dbsymphony, GCM, TOAD, Control-M
Duration : March 2012 to Dec 2013

Description:
DBFeeds is a middle-layer application that receives files from upstream systems, transforms them using ETL, and delivers the files downstream within an SLA (service level agreement). The objective of the DBFeeds project is to provide a "feeds" management service for both data originators and consumers.
DBFeeds connects to upstream and downstream systems through the DB Adaptor tool. DB Adaptor is a common infrastructure and format for file and message delivery globally across DB.
Responsibilities:

• Understood the business requirements and implemented them in a functional database/ETL mapping design.
• Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle, XML, CSV, and delimited files, transform it based on business requirements, and load it to the data warehouse.
• Transformed, validated, and loaded the data into the data warehouse using PowerCenter.
• Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Worked on the connected Lookup transformation to look up values in a table.
• Created reusable transformations, mapplets, and mappings using Informatica.
• Implemented a Slowly Changing Dimension Type II strategy to manage data in the dimension tables.
• Redesigned some of the existing mappings in the system to meet new functionality.
• Checked and tuned the performance of Informatica Mappings.
• Used the Debugger to check data flow in the mappings and made appropriate changes to generate the required results.
• Followed best practices during ETL coding and was involved in the performance tuning process at both the database and ETL code levels.
• Provided training to team members on Informatica PowerCenter.

Project Information
Name : Data Archival Project
Client : Lloyds Banking Group (LBG)
Role : ILM Developer


Platform : UNIX, Windows


Tools : Informatica Lifecycle Management (ILM), DVO, PuTTY
RDBMS : Oracle 10g
Duration : Apr 2011 – Feb 2012

Description:
Lloyds Banking Group (LBG) is a financial services group with more than 30 million customers in the UK and a foothold in every community. It provides everyday banking through a wide range of accounts to its UK customers.
The Data Archival project mainly archives data by decommissioning legacy systems and automating data retention according to rules. Its goals are to archive obsolete data to enhance performance, reduce database size and administration costs, and support data retention rules by creating separate archives based on varying data lifetimes.

Responsibilities:
• Understood the business requirements and implemented them using the ILM (Informatica Lifecycle Management) tool.
• Created accounts for users and provided them appropriate roles in the ILM tool.
• Created source and target connections using the ILM tool.
• Created entities, loaded the appropriate tables and views, and mined them through EDM (Enterprise Data Manager).
• Created new retirement projects and archived the applications successfully using the ILM tool.
• Monitored the ILM jobs after successful run of each retirement project.
• Generated the row count validation report after successful completion of ILM jobs.
• Successfully archived tables/views into FAS (File Archive Service).
• Tested source and target data using the Informatica DVO (Data Validation Option) tool.
• Created users, folders, and ODBC DSN connections, and imported source and target tables using Informatica PowerCenter Designer.
• Created and modified the ODBC.INI file for FAS.
• Created and configured repository connections in DVO.
• Created and added table pairs, added tests, generated value tests, and ran the table-pair tests in DVO.
• Debugged failed table-pair tests in DVO.
• Generated the DVO reports as per the business requirements.
• Raised service requests with Informatica to resolve different complex ILM and DVO issues.
