

Yogananda Reddy Nusi
Email: yogananda9.reddy@gmail.com
Bangalore, India | Ph: 9502734436

Diligent, forward-thinking professional with 14+ years of experience in the Banking and Healthcare domains.
Versatile leader who thrives in dynamic, challenging, and fast-paced professional environments.

--------------------------------------------------------------------------------------------------------------------------------------------

Experience Summary

 Working as an Architect on Big Data and cloud technologies for data projects – Unified Datahub,
Enterprise Data Warehousing, and Data Factory – with 4+ years of onshore experience.

 14+ years of extensive experience in Agile and SDLC projects. Strategically architect, design,
develop, and implement efficient information and operations systems in support of the enterprise,
across all phases of the Software Development Life Cycle – requirement gathering, analysis,
design, development, testing, maintenance, and production support.

 Align the organization’s big data solutions with client initiatives as requested. Utilize Big
Data technologies to design, develop, and evolve scalable and fault-tolerant distributed
components.

 Working on Big Data tools like HDFS, Sqoop, Hive, HBase, Spark, Scala, Elasticsearch, Spark
Streaming, Kafka, and Kibana, and on Hadoop distributions – Cloudera and Hortonworks.

 Worked on cloud services like AWS Glue, Amazon S3, Amazon EMR, AWS Lambda, Amazon
EC2, Amazon RDS, and AWS Step Functions.

 Worked on ETL tools like Ab Initio 3.3, DataStage 8.5, TRMC, and Webeme. Certified in IBM
InfoSphere DataStage v8.5.

Areas/Applications:

Business Domain       Banking, Finance, and Healthcare
Hadoop Tools          Cloudera and Hortonworks distributions; HDFS, Hive, HBase, Scala, Spark, Sqoop, SparkSQL, Spark Streaming, Kafka, Airflow
Cloud Technologies    AWS Glue, Amazon S3, Amazon EMR, AWS Lambda, Amazon EC2, Amazon RDS, AWS Step Functions
ETL Technology        Ab Initio GDE 3.3, Trillium v15, DataStage 8.5, UNIX Shell Scripting, Teradata, TRMC, Webeme, Support Aid
Mainframe Technology  COBOL, JCL, DB2, SAS, VSAM, Easytrieve
Analytics Tools       Mainframe SAS and Enterprise SAS
Search Tools          Elasticsearch and Kibana
DevOps Tools          Jenkins, Jira, SourceTree, Bitbucket, Confluence, Git



Work Experience Summary

Company Duration
IBM India Pvt Ltd July 2007 – Sep 2010
Tata Consultancy Services Sep 2010 – Dec 2018
Wipro Tech Ltd Jan 2019 – Nov 2019
Emids Technologies Pvt Ltd Nov 2019 – Till date

Career Profile

Project: 1
Project Title: Unified Datahub, CareCentrix
Position: Architect (Big Data & Cloud)
Business area: HealthCare
Organization: Emids Technologies Pvt Ltd
Location: Bangalore, India
Duration: Nov 2019 – Till date

Description:
CareCentrix (“CCX”) is a leader in managing care to the home. Headquartered in Hartford, Conn.,
CareCentrix connects patients with the care they need at home through a national network of over 7,400
credentialed provider locations, with customer care centers located across the United States.
CareCentrix’s extensive range of services provides support and coordination for patients and their families
at every step of the healing process, including home health, durable medical equipment (DME), home
infusion, sleep management, and care management services, which ultimately improve care transitions
and reduce unnecessary readmissions and emergency room visits.
The Unified Datahub (“UDH”) project ingests data from all legacy CCX applications and builds a
datahub servicing the downstream systems – datahub orchestration, EPS, MC, and legacy systems. UDH is a
metadata-driven framework that pulls data from Oracle, processes it through Spark and Scala, and writes the
output to HBase and Hive tables.
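
As an illustration of the metadata-driven ingest pattern described above, here is a minimal Spark/Scala sketch that reads one Oracle table over JDBC and lands it in Hive. The JDBC URL, table list, and schema names are hypothetical placeholders (not actual CCX values), and the HBase write (typically done via the hbase-spark connector) is omitted for brevity.

import org.apache.spark.sql.SparkSession

// Minimal sketch of a metadata-driven ingest step; requires the Oracle JDBC
// driver on the classpath. All names below are hypothetical placeholders.
object UdhIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("udh-ingest-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // In UDH the table list would come from the metadata store; one entry is
    // hard-coded here to keep the sketch self-contained.
    val sourceTables = Seq("MEMBER_AUTH")

    spark.sql("CREATE DATABASE IF NOT EXISTS udh_src")

    sourceTables.foreach { table =>
      val df = spark.read
        .format("jdbc")
        .option("url", "jdbc:oracle:thin:@//db-host:1521/SERVICE") // hypothetical
        .option("dbtable", table)
        .option("user", sys.env.getOrElse("ORA_USER", "user"))
        .option("password", sys.env.getOrElse("ORA_PASSWORD", "password"))
        .load()

      // Land the data in a Hive table; an HBase write through the hbase-spark
      // connector would follow the same loop.
      df.write.mode("overwrite").saveAsTable(s"udh_src.${table.toLowerCase}")
    }

    spark.stop()
  }
}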

Hardware: Cloudera (CDH)


Languages: UNIX Shell Script, Spark, Scala, Kibana, SparkSQL, Spark Streaming, Kafka
File System/Database: HDFS, Hive, HBase
Special Software: Git, Jenkins, SourceTree, Confluence, Bitbucket, Jira

Responsibilities:
 Work on requirement gathering, project design, and architecture workflow design
 Work in a fast-paced agile development environment and provide support in defining the scope
and sizing of work.
 Work with data domain experts to understand business requirements and study the existing
application landscape. Responsible for translating business requirements into technology
solutions. Conceptualize technical solutions and maximize the benefit of IT systems investments.
 Work closely with stakeholders to gain organizational commitment for all systems and software
plans, and evaluate and guide the selection of technologies required to complete those plans.
 Coordinate with offshore development team(s) to identify priorities and update scope and delivery
schedule.
 Create data hydration/system integration strategies. Monitor system performance to detect and
resolve problems during deployment and support change management.
 Work with domain experts to put together a delivery plan and stay on track
 Organize meetings with customers and ensure prompt resolution of gaps and roadblocks
 Responsible for the design and execution of abstractions and integration patterns (APIs) to solve
complex distributed computing problems.
 Working on Continuous Integration/DevOps tools – Jenkins, Jira, SourceTree, Bitbucket, and
Git

Project: 2
Project Title: Cloud Persona, Huawei Technologies Ltd
Position: ETL/Big Data Architect
Business area: Telecom
Organization: Wipro Tech Ltd
Location: Bangalore, India
Duration: Jan 2019 to Nov 2019

Description:
Huawei is a leading telecom infrastructure and solutions company. Cloud Persona provides user
tags to WiseMarketing tools for the consumer business. Consumer data is processed on the big data
platform using Hive, Scala, and Spark, and stored in HBase tables. Data from the HBase tables is then
imported into Elasticsearch, making it accessible to users through the UI portal.
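
A minimal Spark/Scala sketch of the final export step described above, assuming the elasticsearch-spark connector is on the classpath; the Hive table, index, and host names are hypothetical placeholders, and the intermediate HBase hop is omitted.

import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._ // from the elasticsearch-spark connector

// Sketch: read persona tags from Hive and index them into Elasticsearch so
// the UI portal can query them. Names and endpoints are placeholders.
object PersonaExportSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("persona-export-sketch")
      .config("es.nodes", "es-host:9200") // hypothetical ES endpoint
      .enableHiveSupport()
      .getOrCreate()

    // User tags previously computed with Hive/Spark.
    val tags = spark.sql("SELECT user_id, tag_name, tag_value FROM persona.user_tags")

    // saveToEs is provided by the org.elasticsearch.spark.sql import.
    tags.saveToEs("persona_tags")

    spark.stop()
  }
}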

Hardware: FusionInsight


Languages: UNIX Shell Script, Spark, Scala
File System/Database: HDFS, Hive, HBase
Special Software: Git, Jenkins

Responsibilities:
 Requirement analysis, project design, and architecture workflow design
 Implemented the complete big data solution for data movement from Hive to HBase and then to
Elasticsearch, ultimately feeding the WiseMarketing portal.
 Define logical and physical data model structures to store, integrate, govern, and maintain data
in a secure and efficient manner while maintaining accuracy of the data in the enterprise data
lake.
 Assist in the strategy to bring existing data models and their transformation logic from legacy
warehouses to a modern big data platform to support analytics, reporting, machine learning, and
AI applications.
 Apply strong familiarity with data governance, data lineage, data processes, DML, and data
architecture control execution. Create and own data flow diagrams for data movement.
 Develop prototypes and proofs of concept using multiple data sources and big data technologies.
 Work iteratively in a fast-paced agile environment.
 Working on Continuous Integration/DevOps tools – Jenkins and Git

Project: 3
Project Title: Key Bank – Enterprise Data Warehousing (EDW), Data Supply Chain (DSC)
Position: Senior Technical Lead and ETL Architect
Business area: Banking
Organization: Tata Consultancy Services (TCS)
Location: Key Bank, 4910, Tiedeman Road, Brooklyn, Ohio – 44144 & Bangalore, India
Duration: June 2013 to Dec 2018
Description:
KeyBank (NYSE: KEY) is a regional bank headquartered in Key Tower on Cleveland, Ohio's
Public Square. KeyBank National Association is a nationally chartered bank regulated by the Office of the
Comptroller of the Currency, Department of the Treasury. Key companies provide investment
management, retail and commercial banking, consumer finance, and investment banking products and
services to individuals and companies throughout the United States and, for certain businesses,
internationally. Key's customer base spans retail, small business, corporate, and investment clients. The
company's businesses deliver their products and services through branches, offices, and telephone
banking centers.

Key Initiatives
1. Data Supply Chain (DSC) - Real Estate Secured Servicing Transformation (RESST)
Data Supply Chain is the new enterprise solution for Key Bank, covering sourcing, consumption,
the Integrated Layer, and DataMarts. The RESST program is designed to transform the current mortgage
servicing platform from the Miser/FIS vendor to the Black Knight Financial Services (BKFS) system. Working
on core integrations to create extracts for all downstream consuming applications.
Data files received from MSP/BKFS are sent to the Hadoop sourcing layer (SRC) to create
external Hive tables for LOB users. The HDFS sourcing data is used to create the consumption
extracts for all downstream applications and the Integrated Layer (IL). Data from the IL layer is used to
build other DataMarts.
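
A minimal Spark/Scala sketch of the sourcing-layer pattern described above: expose a BKFS/MSP landing directory as an external Hive table, then build a consumption extract for one downstream application. Paths, columns, and table names are hypothetical placeholders, not actual Key Bank objects.

import org.apache.spark.sql.SparkSession

// Sketch of the SRC-layer pattern: an external Hive table over the HDFS
// landing path (dropping it leaves the files intact), plus one extract.
object DscSourcingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dsc-sourcing-sketch")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("CREATE DATABASE IF NOT EXISTS src")

    // External table over the HDFS sourcing (SRC) layer.
    spark.sql("""
      CREATE EXTERNAL TABLE IF NOT EXISTS src.msp_loan_master (
        loan_id STRING,
        investor_code STRING,
        unpaid_balance DECIMAL(15,2)
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
      LOCATION '/data/src/bkfs/loan_master'
    """)

    // Consumption extract for a hypothetical downstream consumer.
    spark.sql("SELECT loan_id, unpaid_balance FROM src.msp_loan_master WHERE investor_code = 'FNM'")
      .write.mode("overwrite")
      .option("header", "true")
      .csv("/data/extracts/downstream/loan_master")

    spark.stop()
  }
}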

Hardware: Unix, Hortonworks


Languages: Ab Initio 3.3, UNIX Shell Script, Spark, Scala
File System/Database: VSAM, NAS, HDFS, DB2, Hive
Special Software: TRMC, XLR, PPM, HP Service Manager, SCLM
BKFS tools: MSP Director, Passport, Director Scripting

Responsibilities:
 Requirement analysis, project design, and architecture workflow design
 Worked with the Enterprise Architecture team, program-level stakeholders, and business owners
to review the proposed designs and solutions for this program. Communicate with clients and lines
of business on new projects and requirements. Work on project estimation.
 Delivered “walk the wall” sessions to the entire senior Key Bank leadership team on design
solutions and implementation strategy
 Created workflows, reports, issue logs, and dashboards for stakeholders on various processes
 Provide SOAi and web-based solutions for operations tooling for credit bureau reporting and Genesis
Dialler systems using Director scripting
 Create source-to-target mappings between legacy warehouses and the future state in the data
lake for various business domains.
 Map to information entities that define how information should flow and be consumed by
various business functions and IT customers.

2. EDW (Enterprise Data Warehousing)


Enterprise Data Warehouse (EDW) is designed to be the centralized information repository of the Key
Bank system. The scope of the project is to build the centralized history data warehouse and data marts
for subject areas like deposits, loans, core banking data, and marketing, to provide effective reporting to
power users. This is a data warehouse support and development project. DataMarts included in
EDW were HSH (Householding), FIR (Fee Income Reporting), KFC (Key First Choice), and SALT (State and
Local Tax).
Key Initiatives:
1. Prepaid and Purchase Card
2. First Niagara Conversion
3. Key Merchant services conversion from Elavon to First Data
4. Key Relationship Rewards
5. Householding transformation from Trillium v7 to v15

Hardware: IBM BigInsights, Hadoop, Mainframe 390, Unix, and Linux
Languages: Ab Initio 3.3, Sqoop, Hive, UNIX Shell Script, COBOL, JCL, IBM Mainframe
File System/Database: HDFS, DB2, VSAM, NAS
Special Software: TRMC, XLR, PPM, HP Service Manager, ServiceNow, SCLM

Responsibilities
 Requirement analysis, project design, and architecture workflow design
 Work on Sqoop tooling for one-time data requirements from other applications
 Provide the customer-combined and householding base solution for enterprise use
 Improved efficiency through automation or elimination
 Managed a team of 5 onshore and 15 offshore members
 Communicate with clients and lines of business on new projects and requirements. Work on
project estimates
 Build Ab Initio graphs and Trillium process flows.
 Carry out necessary initiatives to meet customer deliverables, improve customer experience, and
retain staff
 Consistently meet and exceed all goals relating to service delivery
 Improved the development of support staff with cross-training, which increased productivity and
reduced the staffing budget.
 Worked on the automation tools – Validation Extension, Dependency Analysis
 Working on POCs to use Continuous Integration/DevOps tools – Jenkins and Jira
 Establish “best practices” and plan for continuous improvement of processes.
 Gather and clarify requirements from the lines of business and business analysts.

Project: 4
Project Title: Bank of America (BofA) – Deposits Application
Position: Senior Developer
Business area: Banking
Organization: Tata Consultancy Services (TCS)
Location: Chennai, India.
Duration: Sep 2010 – June 2013
Description:
Bank of America Corporation is an American multinational banking and financial services
corporation headquartered in Charlotte, North Carolina. Bank of America aims to create a single,
comprehensive view of information about its people – customers, associates, and prospects – across the
bank. In the current infrastructure there are over 200 account servicing systems that maintain account
data and discrete customer information, of which only a very few exchange customer information.

Extended Account Database application (XAD)


XAD is a Deposit Products Technology application in Bank of America. It holds all deposit accounts
across the franchise within its DB2 database and acts as a processing engine to compute and generate
customer rewards, bonuses, and refunds. XAD performs reward calculation for Bank of America
customers enrolled in various deposit products. The XAD system offers a wide range of features and
functions to support Bank of America’s Consumer and Small Business deposit products and services.

Transaction Repository (TR/RAP)


TR is part of Deposits Products Technology in Bank of America. This application holds customers’
transaction data for the past 13 months and serves those transactions to online banking and many other
screens. It receives its feed from the posting application IMPACS and stores the data in an IMS database.

Key Initiatives
1. XAD: Keep the Change Modification
Bank of America gives customers an incentive amount for debit transactions. The Durbin Amendment
has a significant impact on debit interchange revenue and forces significant changes to the current
payments ecosystem, including the bank’s ability to offer rewards to customers for using their debit
cards. Through this project, the bank is trying to compensate for the revenue loss by partially eliminating
the Keep the Change reward for customers.

2. XAD: Northwest Transformation, California Transformation & Business Customer Solutions


The project aimed at transforming and integrating Bank of America’s legacy Northwest
platform (which processed Idaho and Washington accounts) into the bank’s MODEL platform, the
standard banking platform of the bank. This initiative successfully converted approximately six million NW
customers onto MODEL. The customers of Idaho and Washington now enjoy all MODEL products and
services, which has remarkably enhanced their banking experience.

These projects aim at converting Consumer and Small Business, Commercial customers and
associates doing business in California to MODEL. Once successfully implemented, this initiative
will provide California customers a standard, positive banking experience across the U.S. and will have
converted 15.634 million CA deposit accounts and corresponding customers to the target MODEL bank
deposit sales, servicing, and accounting platforms.

3. TR: PICO Posting and Viewing


PICO – Post in Chronological Order – is an initiative to change the order of transactions displayed
in online banking and other applications. Introduced two columns to identify the order – ETI (Enterprise
Transaction ID) and PICO Category Code.

Hardware: IBM Mainframe 390


Languages: DataStage 8.5, JCL, COBOL, REXX, SQL
File System/Database: VSAM, DB2, IMS
Special Software: NDM, FILE-AID, IBM utilities, CA7, ChangeMan

Responsibilities:
 Project discussions to consolidate status and requirements
 Perform system analysis, design the project requirements, and create the low-level design
 Perform application development and unit testing
 Modification and enhancement of the application system
 Perform program construction/modification for problem fixes and other enhancements
 Analysis of the system at the unit, integration, and system test levels
 Review code specifications for enhancements and check and modify COBOL programs against
coding standards using ASA (Automated Standard Analyzer)



Project: 5
Project Title: Dun & Bradstreet – Risk Management System (RMS)
Position: Developer
Business area: Credit Rating (Financial)
Organization: IBM India Pvt Ltd
Location: Bangalore, India
Duration: July 2007 to Sep 2010

Description:
D&B (NYSE: DNB), the leading provider of global business information, tools, and insight, has
enabled customers to Decide with Confidence for over 60 years.
RMS is the Risk Management Solutions sales segment of D&B and provides risk-related
information to customers. The process varies depending on whether the client needs data for global,
Canada, or US customers, and on the type of request – risk-related data, ad hoc data requests, or standard
products such as SBRI, SBRPS, Risk Assessment Manager (RAM), and eRAM.

Hardware: IBM Mainframe 390


Languages: JCL, SAS, Easytrieve, REXX
File System/Database: VSAM
Special Software: NDM, FILE-AID, IBM utilities, Change man

Responsibilities:
 Project discussions to consolidate status and requirements
 Perform system analysis, design the project requirements, and create the low-level design
 Perform application development and unit testing
 Modification and enhancement of the application system
 Adhere to quality standards and follow continuous improvement practices to deliver quality
products to the customer; communicate with the client.
 Involved in statistical analysis and manipulation of datasets using SAS for the credit ratings.
 Plan night shifts for the team and maintain statistics for the global files
 Moderate code re-engineering and process improvement efforts

Professional Certification:
 IBM InfoSphere DataStage v8.5 Certified

Education:
 Bachelor of Technology from G Pulla Reddy Engg College, Sri Krishnadevaraya University, Kurnool

Personal Details:
Date of Birth: June 9, 1986
Sex: Male
Nationality: Indian
Marital Status: Married
Designation: Architect
Location: Bangalore, India
