Yogananda Reddy Nusi: Sensitivity: Internal & Restricted
Bangalore, India | Ph: 9502734436
Diligent, forward-thinking professional with 14+ years of experience in the Banking and Healthcare domains.
Versatile leader who thrives in dynamic, challenging, and fast-paced professional environments.
--------------------------------------------------------------------------------------------------------------------------------------------
Experience Summary
Working as an Architect on Big Data and Cloud technologies for data projects – Unified Datahub,
Enterprise Data Warehousing, and Data Factory – with 4+ years of onshore experience.
14+ years of extensive experience in Agile and SDLC projects. Strategically architect, design,
develop, and implement efficient information and operations systems in support of the
enterprise, across all Software Development Life Cycle phases – requirement gathering, analysis,
design, development, testing, maintenance, and production support.
Align the organization’s big data solutions with client initiatives as requested. Utilize Big Data
technologies to design, develop, and evolve scalable, fault-tolerant distributed components.
Working on Big Data tools such as HDFS, Sqoop, Hive, HBase, Spark, Scala, Elasticsearch, Spark
Streaming, Kafka, and Kibana; Hadoop distributions – Cloudera and Hortonworks.
Worked on cloud services such as AWS Glue, Amazon S3, Amazon EMR, AWS Lambda, Amazon
EC2, Amazon RDS, and AWS Step Functions.
Worked on ETL tools such as Ab Initio 3.3, IBM DataStage 8.5, TRMC, and Webeme. Certified in
IBM InfoSphere DataStage v8.5.
Areas/Applications:
Company                              Duration
IBM India Pvt Ltd                    July 2007 – Sep 2010
Tata Consultancy Services            Sep 2010 – Dec 2018
Wipro Tech Ltd                       Jan 2019 – Nov 2019
Emids Technologies Pvt Ltd           Nov 2019 – till date
Career Profile
Project: 1
Project Title: Unified Datahub, Carecentrix
Position: Architect (Big Data & Cloud)
Business area: HealthCare
Organization: Emids Technologies Pvt Ltd
Location: Bangalore, India
Duration: Nov 2019 – till date
Description:
CareCentrix (“CCX”) is a leader in managing care to the home. Headquartered in Hartford, Conn.,
CareCentrix connects patients with the care they need at home, through a national network of over 7,400
credentialed provider locations, with customer care centers located across the United States.
CareCentrix’s extensive range of services provides support and coordination for patients and their families
in every step of the healing process, including home health, durable medical equipment (DME), home
infusion, sleep management and care management services, which ultimately improve care transitions
and reduce unnecessary readmissions and emergency room visits.
The Unified Datahub (“UDH”) project ingests data from all legacy applications of CCX and builds a
datahub serving the downstream systems – datahub orchestration, EPS, MC, and legacy systems. UDH is a
metadata-driven framework that gets data from Oracle, processes it through Spark & Scala, and writes the
output to HBase and Hive tables.
Responsibilities:
Work on requirement gathering, project design, and architecture workflow design
Work in a fast-paced agile development environment and provide support in defining the scope
and sizing of work.
Work with data domain experts to understand business requirements and study the existing
application landscape. Responsible for translating business requirements into technology
solutions. Conceptualize technical solutions and maximize the benefit of IT systems investments.
Work closely to gain organizational commitment for all systems and software plans, as well as
evaluate and guide the selection of technologies required to complete those plans.
Coordinate with offshore development teams to identify priorities and update the scope and
delivery schedule.
Project: 2
Project Title: Cloud Persona, Huawei Technologies Ltd
Position: ETL/Big Data Architect
Business area: Telecom
Organization: Wipro Tech Ltd
Location: Bangalore, India
Duration: Jan 2019 to Nov 2019
Description:
Huawei is a leading telecom infrastructure and solutions company. Cloud Persona provides user
tags to Wisemarketing tools for the consumer business. Consumer data is processed on the big data
platform using Hive, Scala, and Spark and stored in HBase tables. Data from the HBase tables is then
imported into Elasticsearch, making it accessible to users from the UI portal.
Responsibilities:
Requirement analysis, project design, and architecture workflow design
Implemented the complete big data solution for data movement from Hive to HBase and
then to Elasticsearch, and finally to the Wisemarketing portal for marketing.
Define logical and physical data model structures to store, integrate, govern, and maintain data
in a secure and efficient manner while maintaining accuracy of the data in the enterprise data
lake.
Assist in strategy to bring the existing data models and their transformational logic from legacy
warehouses to a modern big data platform to support analytics, reporting, machine learning and
AI applications.
Strong familiarity with data governance, data lineage, data processes, DML, and data architecture
control execution. Create and own Data Flow Diagrams for data movement.
Develop prototypes and proof of concepts using multiple data-sources and big-data technologies.
Experience working iteratively in a fast-paced agile environment.
Working on Continuous Integration/DevOps tools – Jenkins and Git
Project: 3
Project Title: Key Bank – Enterprise Data Warehousing (EDW), Data Supply Chain (DSC)
Position: Senior Technical Lead and ETL Architect
Business area: Banking
Key Initiatives
1. Data Supply Chain (DSC) - Real Estate Secured Servicing Transformation (RESST)
Data Supply Chain is the new enterprise solution for Key Bank, covering sourcing, consumption, the
Integrated Layer, and DataMarts. The RESST program is designed to transform the current mortgage
servicing platform from the Miser/FIS vendor to the Black Knight Financial Services (BKFS) system.
Working on core integrations to create extracts for all downstream consuming applications.
Data files received from MSP/BKFS are sent to the Hadoop sourcing layer (SRC) to create
external Hadoop tables for LOB users. The HDFS sourcing data is used to create the consumption
extracts for all downstream applications and the Integrated Layer (IL). Data from the IL layer is used to
build other DataMarts.
Responsibilities:
Requirement analysis, project design, and architecture workflow design
Worked with and reviewed the proposed design and solutions with the Enterprise Architecture
team, program leadership, and business owners. Communicate with clients and lines of
business on new projects and requirements. Work on project estimation.
Gave walk-the-wall sessions to the entire senior Key Bank leadership team on the design
solutions and implementation strategy
Created workflows, reports, issue logs, and dashboards for stakeholders on various processes
Provided SOAi and web-based solutions for the operations tool for credit bureau reporting and the
Genesis Dialler systems using Director scripting
Create source-to-target mappings between legacy warehouses and the future state in the data
lake for various business domains.
Map to information entities that define how information should flow and be consumed by
various business functions and IT customers.
Hardware: IBM BigInsights, Hadoop, Mainframe 390, Unix, and Linux
Languages: Ab Initio 3.3, Sqoop, Hive, UNIX shell script, COBOL, JCL, IBM Mainframe
File System/Database: HDFS, DB2, VSAM, NAS
Special Software: TRMC, XLR, PPM, HP Service Manager, ServiceNow, SCLM
Responsibilities:
Requirement analysis, project design, and architecture workflow design
Work on Sqoop tools for one-time data requirements from other applications
Provide a combined customer and householding base solution for enterprise use
Improved efficiency through automation or elimination
Managed a team of 5 onshore and 15 offshore members.
Communicate with clients and Lines of business on the new projects and requirements. Work on
the project estimates
Build Ab Initio graphs and Trillium process flows.
Carry out necessary initiatives to meet customer deliverables, improve customer experience, and
retain staff
Consistently meet and exceed all goals relating to service delivery
Improved the development of support staff through cross-training, which increased productivity
and reduced the staffing budget.
Worked on the automation tools – Validation Extension and Dependency Analysis
Working on POCs to use Continuous Integration/DevOps tools – Jenkins and Jira
Establish “best practices” and plan for continuous improvement of processes.
Gather and clarify requirements from the lines of business and Business Analysts.
Project: 4
Project Title: Bank of America (BofA) – Deposits Application
Position: Senior Developer
Business area: Banking
Organization: Tata Consultancy Services (TCS)
Location: Chennai, India.
Duration: Sep 2010 – June 2013
Description:
Bank of America Corporation is an American multinational banking and financial services
corporation headquartered in Charlotte, North Carolina. Bank of America aims at creating a single and
comprehensive view of the information about its people that includes customers, associates, and
prospects across the bank. In the current infrastructure there are over 200 account servicing systems that
maintain account data and discrete customer information, of which only a very few systems exchange
customer information.
Key Initiatives
1. XAD: Keep the Change Modification
Bank of America gives an incentive amount for each debit transaction. The Durbin Amendment will
have a significant impact on debit interchange revenue and causes significant changes to the current
payments ecosystem, including the bank’s ability to offer rewards to customers for using debit cards.
Through this project, the bank is trying to compensate for the revenue loss by partially eliminating the
Keep the Change reward for customers.
These projects aim at converting Consumer and Small Business, Commercial customers and
associates doing business in California to the Target Model. Once successfully implemented, this initiative
will provide California customers a standard and positive banking experience across the U.S., and will
convert 15.634 million CA deposit accounts and corresponding customers to the Target Model Bank
deposit sales, servicing, and accounting platforms.
Responsibilities:
Project discussions to consolidate the status and requirements
Perform system analysis, Design the project requirements and Create the Low-level design
Perform application development and Unit testing
Modification and enhancement of the application system
Perform program construction / modification due to problem fixes and other enhancements
Analysis of the System at Unit, Integration and System Test Level
Review of code specifications for enhancements and checking and modifying the COBOL program
coding standards using ASA (Automated Standard Analyzer)
Description:
D&B (NYSE: DNB), the leading provider of global business information, tools, and insight, has
enabled customers to Decide with Confidence for over 60 years.
RMS is the Risk Management Solutions sales segment of D&B. The Risk Management System
provides risk-related information to customers. The process varies depending on whether the client
needs data for Global, Canada, or US customers, and on the type of data requested – risk-related data,
ad hoc data requests, or standard products such as SBRI, SBRPS, Risk Assessment Manager (RAM), and eRAM.
Responsibilities:
Project discussions to consolidate the status and requirements
Perform system analysis, Design the project requirements and create the Low-level design
Perform application development and Unit testing
Modification and enhancement of the application system
Adherence to quality standards and the concept of continuous improvement to provide quality
products to the customer; communication with the client.
Involved in statistical analysis and manipulation of datasets using SAS for credit ratings.
Planned night shifts for the team and maintained statistics for the global files
Moderator of code re-engineering and process improvement
Professional Certification:
Certified in IBM InfoSphere DataStage v8.5.
Education:
Bachelor of Technology from G Pulla Reddy Engineering College, Sri Krishnadevaraya University, Kurnool
Personal Details:
Date of Birth: June 9, 1986
Sex: Male
Nationality: Indian
Marital Status: Married
Designation: Architect
Location: Bangalore, India