Nataraj CV 12+ - New
Nataraj Virupaksham
9620203536
Technical Skills
Operating System: Windows XP/2000/9x, NT, Unix
Informatica Tools: Informatica 10.x/9.x/8.x, Informatica Cloud (IICS), Informatica Lifecycle Management (ILM) 6.2, IDV 6.2 (FAS), Control-M, Putty, Autosys
Database: Oracle 11g, Teradata, MS SQL Server, MySQL, Greenplum, DB2, MS Access
Professional Experience:
HTC Global Services Inc.
Project Information
Name : AMFAM PROD SUPPORT
Client : AMFAM
Role : Senior Technical Lead
Platform : UNIX
Tools : Informatica 10.2.0, Autosys
RDBMS : Oracle, Greenplum
Duration : Jan 2021 till date
Description:
American Family Insurance (AMFAM) is a private mutual company that focuses on property, casualty, and auto insurance, and also offers commercial insurance, life, health, and homeowners coverage as well as investment and retirement-planning products.
In this production support and maintenance project, I am mainly involved in resolving tickets and have also worked on developing code changes as per service requests.
Responsibilities:
• Designed workflows with multiple sessions along with Decision, Assignment, Event Wait, and Event Raise tasks, and used Control-M to schedule jobs.
• Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
• Performed unit testing at various levels of the ETL and actively involved in team code
reviews.
• Identified problems in existing production data and developed one-time scripts to correct them (see the sketch after this list).
• Fixed invalid mappings and troubleshot technical problems in the database.
• Experience in applying Teradata utilities like FastLoad and MultiLoad.
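As a hedged illustration of the one-time data-correction scripts mentioned above: the table, columns, and database below are hypothetical stand-ins (the real fixes ran against the project's Oracle/Greenplum tables), so this is only a minimal sketch of the pattern, not the actual script.

# One-time data fix sketch: normalize known-bad status codes in a staging table.
# sqlite3 is used here purely as a stand-in database; the real script ran against
# Oracle/Greenplum. Table and column names (policy_stage, status_cd) are hypothetical.
import sqlite3

def fix_status_codes(conn):
    """Map known-bad status codes to corrected values in one transaction."""
    corrections = {"ACTV ": "ACTV", "cncl": "CNCL", "N/A": "UNKNOWN"}
    cur = conn.cursor()
    fixed = 0
    for bad, good in corrections.items():
        cur.execute(
            "UPDATE policy_stage SET status_cd = ? WHERE status_cd = ?",
            (good, bad),
        )
        fixed += cur.rowcount
    conn.commit()
    return fixed

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")  # stand-in for the real database connection
    conn.execute("CREATE TABLE policy_stage (policy_id TEXT, status_cd TEXT)")
    conn.executemany("INSERT INTO policy_stage VALUES (?, ?)",
                     [("P1", "ACTV "), ("P2", "cncl"), ("P3", "ACTV")])
    print("rows corrected:", fix_status_codes(conn))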
Project Information
Name : EBIA - SAFT
Client : Dell Internal
Role : Technical Lead
Platform : UNIX
Tools : Informatica 10.2.0, IICS, Control M, Nexus
RDBMS : Oracle, Teradata.
Duration : 01-May-2019 to Dec 2020
Description:
Dell Technologies is a leader in digital transformation, providing digital technology solutions, products, and services to drive business success. SAFT (Standard Audit File for Tax) is a system with several data flows, such as LZ, CDL, and RDL. We pull data from the source system, which is in Oracle, generate an LRF file from the source, and then transform the data from the LRF file into the Inc layer, then into the LZ and CDL layers, and finally into the RDL layer.
Responsibilities:
Project Information
Name : SLB ETL DEV
Client : Schlumberger Ltd, Houston, United States
Role : Informatica Senior Developer
Platform : UNIX
Tools : Informatica 10, DVO, Putty, SQL Server, SAP
RDBMS : Oracle, Teradata.
Duration : 05-Nov-2017 to 30-Apr-2019
Description:
Schlumberger Limited is the world's largest oilfield services company. Schlumberger
employs approximately 95,000 people representing more than 140 nationalities working in more than
85 countries. SLB ETL covers modules such as Sales Order, Purchase Order, General Ledger, and Demantra. In this project we pulled data from various SAP tables, SQL Server tables, and Siebel tables and moved the data to an Oracle database according to the business rules provided by the client.
Responsibilities:
• Understood business requirements and implemented them in functional database/ETL mapping designs; experienced in creating users and source and target connections.
• Worked on SAP sources and pulled data from different SAP tables using the Generate BCI Mappings option available in Informatica PowerCenter.
• Changed the metadata of the SAP tables in Informatica by using the Generate and Install SAP R/3 Code option.
• Pulled data from a legacy Siebel database as well as SQL Server databases.
• Created mappings using transformations such as Source Qualifier, Lookup, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Implemented complex business rules by creating reusable transformations, mapplets, sessions, and worklets, and used the shared folder concept with shortcuts wherever possible to avoid redundancy.
• Experience in creating, executing, and monitoring workflows with Workflow Manager and Workflow Monitor.
• Experience in handling complex workflows that include a large number of sessions and worklets.
• Experience in applying Teradata utilities like FastLoad and MultiLoad.
• Created parameter files for the respective sessions and workflows (see the sketch after this list).
• Worked on tasks such as Assignment, Email, Control, and Decision, available in the Informatica PowerCenter Workflow Manager.
• Involved in deployment activities to promote multiple Informatica objects from DEV to QA and then to PROD.
• Created deployment groups in Informatica to record deployment objects.
• Worked extensively on implementing Slowly Changing Dimensions (SCD) Types I, II, and III in different mappings as per the requirements.
• Involved in documentation, reviewing and creating LLDs and HLDs for all systems used in the SLB project.
• Actively involved in daily calls and onshore-offshore handovers to understand business requirements from onsite.
• Involved in integration of various data Sources like SAP, SQL Server, Oracle and flat files
into the Staging area.
• Actively involved in monitoring, scheduling, and holding Informatica jobs using the Autosys scheduling tool.
• Experience in understanding UNIX scripts used for Informatica scheduling.
• Created Autosys documents and other project-specific documents.
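As a hedged illustration of the session/workflow parameter files mentioned in the list above: the folder, workflow, session, and parameter names below are hypothetical placeholders rather than actual SLB objects. A minimal Python sketch that writes a PowerCenter-style parameter file for one run:

# Sketch: generate an Informatica PowerCenter parameter file for one workflow run.
# Folder/workflow/session/parameter names are hypothetical examples.
from datetime import date

def build_param_file(path, run_date):
    lines = [
        "[SLB_ETL.WF:wf_sales_order_load.ST:s_m_sales_order_stage]",  # hypothetical section header
        "$$RUN_DATE=%s" % run_date.strftime("%Y-%m-%d"),
        "$$SRC_SCHEMA=SAP_STG",
        "$$TGT_SCHEMA=EDW",
        "$DBConnection_Target=ORA_EDW_CONN",
    ]
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    build_param_file("wf_sales_order_load.param", date.today())
    print(open("wf_sales_order_load.param").read())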
Project Information
Name : CSS ODS Staging
Client : Farmers Insurance
Role : Informatica Powercenter Developer
Platform : UNIX, Windows
Tools : Informatica 9.5.1, DVO, Putty , Autosys
RDBMS : DB2
Duration : 20-Apr-2015 to 01-Nov-2017
Description:
Farmers Insurance serves more than 10 million households with more than 20 million individual policies across all 50 states through the efforts of over 50,000 exclusive and independent agents and nearly 24,000 employees. CSS ODS Staging mainly retrieves insurance claims from three systems named PLA, MDM, and LIFE. We moved legacy data into the PLA, MDM, and LIFE systems by applying business rules.
Responsibilities:
• Understood business requirements and implemented them in functional database/ETL mapping designs; experienced in creating users and source and target connections.
• Strong development experience creating ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, DB2, SFDC, XML, CSV, and delimited files, transform it based on business requirements, and load it to the data warehouse.
• Created mappings using transformations such as Source Qualifier, Lookup, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Implemented complex business rules by creating reusable transformations, mapplets, sessions, and worklets, and used the shared folder concept with shortcuts wherever possible to avoid redundancy.
• Experience in creating, executing, and monitoring workflows with Workflow Manager and Workflow Monitor.
• Worked on tasks such as Assignment, Email, Control, and Decision, available in the Informatica PowerCenter Workflow Manager.
• Involved in deployment activities to promote multiple Informatica objects from DEV to QA and then to PROD.
• Created deployment groups in Informatica to record deployment objects.
• Worked extensively on implementing Slowly Changing Dimensions (SCD) Types I, II, and III in different mappings as per the requirements.
• Involved in documentation, reviewing and creating LLDs and runbooks for all systems used in the ODS projects.
• Actively involved in daily calls and onshore-offshore handovers to understand business requirements from onsite.
• Involved in integration of various data sources like DB2, SFDC, and flat files into the staging area.
• Actively involved in monitoring, scheduling, and holding Informatica jobs using the Autosys scheduling tool.
• Experience in understanding UNIX scripts used for Informatica scheduling (see the sketch after this list).
• Created Autosys documents and other project-specific documents.
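As a hedged illustration of the UNIX scripts used for Informatica scheduling mentioned above: the real jobs were shell scripts invoked by Autosys, and the service, domain, folder, and workflow names below are hypothetical placeholders. A minimal Python sketch of a wrapper that starts a workflow with pmcmd (assumed to be on the PATH) and propagates its exit status:

# Sketch: wrapper that an Autosys job could invoke to start an Informatica workflow
# via pmcmd and return its exit code. Service, domain, folder, and workflow names
# are hypothetical placeholders; credentials would normally come from a secure source.
import subprocess
import sys

def start_workflow(folder, workflow):
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",          # integration service (placeholder)
        "-d", "Domain_DEV",            # domain (placeholder)
        "-u", "etl_user",              # user (placeholder)
        "-p", "password",              # in practice, read from a vault or environment
        "-f", folder,
        "-wait",                       # block until the workflow completes
        workflow,
    ]
    result = subprocess.run(cmd)
    return result.returncode           # non-zero means the workflow failed to run cleanly

if __name__ == "__main__":
    sys.exit(start_workflow("CSS_ODS", "wf_ods_claim_stage_load"))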
Project Information
Name : Purge Framework
Client : Farmers Insurance
Role : Informatica ILM Developer
Platform : UNIX, Windows
Tools : ILM 6.2, DVO, Putty
RDBMS : Oracle 10g, SQL Server
Duration : 01-Oct-2014 to 19-Apr-2015
Description:
Farmers Insurance serves more than 10 million households with more than 20 million individual policies across all 50 states through the efforts of over 50,000 exclusive and independent agents and nearly 24,000 employees. The Purge Framework mainly retrieves insurance claims from the Data Vault via Informatica PowerCenter and generates reports according to the business rules provided by the client. After the reports are generated, the legacy data is purged.
Responsibilities:
Project Information
Name : Tenix Data Archiving
Client : Tenix Data Solutions
Role : Informatica ILM Developer
Platform : UNIX, Windows
Tools : Informatica PowerCenter 9.5, Information Lifecycle Management (ILM) 6.2, DVO, Putty
RDBMS : Oracle 10g, SQL Server
Duration : 01-Feb-2014 to Aug 2014
Description:
Tenix is a privately owned Australian company involved in a range of infrastructure
maintenance and engineering products and services to the utility, transport, mining and industrial
sectors in Australia, New Zealand, the Pacific Islands, and the United States. The Data Archival project archives data by decommissioning legacy systems and automating data retention according to rules. It mainly archives obsolete data to enhance performance and reduce database size and administration costs, and also supports data retention rules by creating separate archives based on varying data lifetimes.
Responsibilities:
• Created source and target connections, security groups, and users using Informatica ILM administration.
• Hands-on experience in creating archive databases and files and restoring archived databases and files.
• Created accounts for users and provided appropriate roles for them in ILM tool.
• Monitored the ILM jobs after the successful run of each retirement project.
• Generated the row count validation report after successful completion of ILM jobs (see the sketch after this list).
• Successfully archived tables/views into FAS (File Archive Service).
• Tested source and target data using the Informatica DVO (Data Validation Option) tool.
• Created users, folders, ODBC DSN connections, and imported source and target tables by
using Informatica Power center Designer.
• Created Source and Target connections by using ILM tool.
• Created Entities and loaded the appropriate tables and views and mined them through EDM
(Enterprise Data Manager).
• Created new retirement projects and archived the applications successfully by using ILM
Tool.
• Created and modified the ODBC.INI file for FAS.
• Worked on live archiving and application retirement.
• Strong experience in IDV (Informatica Data Vault) (FAS).
• Worked on the Data Discovery portal using the Browse Data and Search File Archive options.
• Generated reports using Data Visualization in ILM.
• Created and configured repository connections in DVO.
• Created and added table pairs, added tests, generated value tests, and ran the table pair tests in DVO.
• Debugged failed table pair tests in DVO.
• Generated the DVO reports as per the business requirements.
• Raised Service Requests for Informatica to resolve different complex ILM and DVO issues.
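As a hedged illustration of the row count validation report mentioned in the list above: the table names and counts below are hypothetical stand-ins, and in practice the counts came from the source database and the FAS archive rather than in-memory dictionaries. A minimal Python sketch that compares source and archive counts and writes a report:

# Sketch: compare source vs. archive row counts per table and write a simple
# validation report. Counts are passed in as dicts here; in practice they would
# be queried from the source database and the FAS archive.
import csv

def row_count_report(source_counts, archive_counts, out_path):
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["table", "source_rows", "archive_rows", "status"])
        for table in sorted(source_counts):
            src = source_counts[table]
            arc = archive_counts.get(table, 0)
            writer.writerow([table, src, arc, "MATCH" if src == arc else "MISMATCH"])

if __name__ == "__main__":
    source = {"CLAIM_HDR": 120000, "CLAIM_DTL": 540000}   # hypothetical tables
    archive = {"CLAIM_HDR": 120000, "CLAIM_DTL": 539998}
    row_count_report(source, archive, "row_count_validation.csv")
    print(open("row_count_validation.csv").read())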
Project Information
Name : DB Feeds
Client : Deutsche Bank
Role : Informatica Developer
Platform : UNIX, Windows
Tools : Informatica 8.6 / Informatica Cloud.
RDBMS : Oracle 10g
Other Tools : Dbsymphony, GCM, TOAD, Control M
Duration : March 2012 to Dec 2013
Description:
DBFeeds is a middle-layer application that receives files from upstream systems, transforms them using ETL, and delivers the files downstream within an SLA (service level agreement). The objective of the DBFeeds project is to provide a “feeds” management service for both data originators and consumers.
DBFeeds connects to upstream and downstream systems through the DB Adaptor tool. DB Adaptor is a common infrastructure and format for file and message delivery globally across DB.
Responsibilities:
• Understood business requirements and implemented them in a functional database/ETL mapping design.
• Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle, XML, CSV, and delimited files, transform it based on business requirements, and load it to the data warehouse.
• Transformed, validated, and loaded the data into the data warehouse using PowerCenter.
• Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Router, Filter, Rank, Sequence Generator, and Update Strategy.
• Worked on connected Lookup transformation to look up values in a table.
• Created reusable transformations, mapplets, and mappings using Informatica.
• Implemented a Slowly Changing Dimension (SCD) Type II strategy to manage data in the dimension tables (see the sketch after this list).
• Redesigned some of the existing mappings in the system to meet new functionality.
• Checked and tuned the performance of Informatica Mappings.
• Used Debugger to check the data flow in the mapping and made appropriate changes in the
mappings to generate the required results.
• Followed best practices during ETL development and was involved in performance tuning at both the database and ETL code level.
• Gave training to team members on Informatica PowerCenter.
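As a hedged illustration of the SCD Type II strategy mentioned in the list above: the real implementation was built as Informatica mappings, and the dimension layout and column names below (cust_id, address, eff_date, end_date, is_current) are hypothetical. A minimal Python sketch of the expire-and-insert logic:

# Sketch of SCD Type II logic: expire the current dimension row when a tracked
# attribute changes and insert a new current row. Column names are hypothetical.
from datetime import date

HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dim_rows, source_rows, load_date):
    """dim_rows: list of dicts for the dimension; source_rows: dict keyed by cust_id."""
    current = {r["cust_id"]: r for r in dim_rows if r["is_current"]}
    for cust_id, src in source_rows.items():
        cur = current.get(cust_id)
        if cur is None:
            # brand-new key: insert the first version
            dim_rows.append({"cust_id": cust_id, "address": src["address"],
                             "eff_date": load_date, "end_date": HIGH_DATE, "is_current": True})
        elif cur["address"] != src["address"]:
            # changed attribute: close out the old version, open a new one
            cur["end_date"] = load_date
            cur["is_current"] = False
            dim_rows.append({"cust_id": cust_id, "address": src["address"],
                             "eff_date": load_date, "end_date": HIGH_DATE, "is_current": True})
    return dim_rows

if __name__ == "__main__":
    dim = [{"cust_id": 1, "address": "Old St", "eff_date": date(2012, 1, 1),
            "end_date": HIGH_DATE, "is_current": True}]
    src = {1: {"address": "New St"}, 2: {"address": "First St"}}
    for row in apply_scd2(dim, src, date(2013, 6, 1)):
        print(row)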
Project Information
Name : Data Archival Project
Client : Lloyds Banking Group (LBG)
Role : ILM Developer
Description:
Lloyds Banking Group (LBG) is a financial services group with more than 30 million
customers in the UK, and a foothold in every community. It provides everyday banking through a wide
range of accounts to its UK customers.
The Data Archival project archives data by decommissioning legacy systems and automating data retention according to rules. It mainly archives obsolete data to enhance performance and reduce database size and administration costs, and also supports data retention rules by creating separate archives based on varying data lifetimes.
Responsibilities:
• Understanding the Business Requirements and implement the same by using ILM
(Informatica Lifecycle Management) tool.
• Created accounts for users and provided appropriate roles for them in ILM tool.
• Created Source and Target connections by using ILM tool.
• Created Entities and loaded the appropriate tables and views and mined them through EDM
(Enterprise Data Manager).
• Created new retirement projects and archived the applications successfully by using ILM
Tool.
• Monitored the ILM jobs after successful run of each retirement project.
• Generated the row count validation report after successful completion of ILM jobs.
• Successfully archived tables/views into FAS (File Archive Service).
• Tested source and target data using the Informatica DVO (Data Validation Option) tool.
• Created users, folders, ODBC DSN connections, and imported source and target tables by
using Informatica Power center Designer.
• Created and modified the ODBC.INI file for FAS.
• Created and configured repository connections in DVO.
• Created and added table pairs, added tests, generated value tests, and ran the table pair tests in DVO.
• Debugged failed table pair tests in DVO.
• Generated the DVO reports as per the business requirements.
• Raised Service Requests for Informatica to resolve different complex ILM and DVO issues.