Raghavendra Krishnam
Mobile: +1-309-***-**** E-Mail: ***********.********@*****.***
PROFESSIONAL SNAPSHOT
A competent professional with 11 years of experience in data warehousing, software development and implementation using Informatica 10.2, Oracle 12c, Teradata 15.6, ESP and Unix scripting.
Implemented business solutions in Insurance, Banking, Healthcare, and Communications, Media and Technology.
Played diverse roles in data space spanning Data Engineering, Data Analytics, Data Architecture, Data Product Management, Business Intelligence and Data Modeling disciplines.
Implemented ETL solutions in compensation systems covering personal lines and commercial lines policies, primarily calculating commissions for exclusive and independent agents.
Dealt with various policies such as Auto, Property, Fire, Life and Earthquake from different source systems such as Allied, Harleysville and THI, and implemented various complex ETL solutions to calculate commissions for exclusive and independent agents.
Implemented ETL solutions in CMT (Communications, Media and Technology) covering renewal rate, attach rate and coverage rate for Cisco products and services.
Involved in quick turn-around POC systems to demonstrate the functionality of various data warehouse / business intelligence environments.
Exposed to various functional areas of Data warehousing including Banking, Insurance, Telecom and Health Care.
Strong professional skilled in Performance Tuning, Requirements Analysis and Agile Methodologies.
Experience in gathering customer requirements and expertise in designing ETL code.
Expert understanding of data warehouse architecture; involved in application support and maintenance.
Expertise in data analysis, data modeling, ETL code design, testing, project implementation and application support.
Extensively worked in a Ruby automation testing environment and captured major and minor bugs during code development.
Followed the Acquire, Standardize, Integrate, Calculate and Provide pattern of the standard, extendable DIF framework for ETL processing.
Extensive knowledge with Relational & Dimensional data modeling, Star & Snowflake schema, Fact & Dimension tables and process mapping using the top-down and bottom-up approach.
Extensively used Aggregator, Joiner, Router, Normalizer, Lookup, XML Parser, Expression, Filter, Source Qualifier, Union and Java transformations to build ETL functionality.
Created Informatica reusable objects (Sessions, Mapplets, Worklets, Transformations).
Worked with scheduling tools such as the CA ESP Workstation scheduler and Control-M to schedule jobs to kick off at regular intervals.
Designed and implemented Type 1, Type 2 and Type 3 slowly changing dimensions (SCD1, SCD2 and SCD3); a sketch of the SCD2 pattern follows this list.
Prepared Integrated Design Specification (IDS) and low-level mapping sheet from the requirement specified in Integrated Requirement Specification (IRS) and mapping sheet provided by users.
Presenting the prepared IDS document in Integrated Coaching Session (ICS) for design approval.
Experience working from client locations in the United States of America.
Migrated the ETL tool from Sagent to Informatica PowerCenter.
Excellent communication and interpersonal skills, enabling me to work with people at all levels across all functions.
Worked on CDC (Change Data Capture) implementations for various requirements.
Expertise in Teradata utilities (TPT Load) and BTEQ scripts.
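Illustrative SCD Type 2 sketch for the dimension work noted above (a minimal example only; AGENT_DIM, AGENT_STG and the column names are hypothetical placeholders, not actual project objects):

-- Expire the current dimension row when a tracked attribute changes
-- (hypothetical table and column names).
UPDATE AGENT_DIM
SET    EFF_END_DT   = CURRENT_DATE - 1,
       CURR_ROW_IND = 'N'
WHERE  CURR_ROW_IND = 'Y'
AND    EXISTS (SELECT 1
               FROM   AGENT_STG s
               WHERE  s.AGENT_ID = AGENT_DIM.AGENT_ID
               AND   (s.AGENT_NAME <> AGENT_DIM.AGENT_NAME
                      OR s.COMM_PLAN <> AGENT_DIM.COMM_PLAN));

-- Insert new and changed agents as the current version.
INSERT INTO AGENT_DIM (AGENT_ID, AGENT_NAME, COMM_PLAN,
                       EFF_START_DT, EFF_END_DT, CURR_ROW_IND)
SELECT s.AGENT_ID, s.AGENT_NAME, s.COMM_PLAN,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   AGENT_STG s
WHERE  NOT EXISTS (SELECT 1
                   FROM   AGENT_DIM d
                   WHERE  d.AGENT_ID     = s.AGENT_ID
                   AND    d.CURR_ROW_IND = 'Y'
                   AND    d.AGENT_NAME   = s.AGENT_NAME
                   AND    d.COMM_PLAN    = s.COMM_PLAN);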
PROFICIENCY FORTE
Technical:
ETL Tools : Informatica PowerCenter 10.2, IBM DataStage 8.5 and Oracle Warehouse Builder
Databases : Oracle 12c/11g/10g/9i, Teradata, DB2, Progress
Database Tools : Oracle SQL*Plus, TOAD, Teradata SQL Assistant 15.10, PL/SQL Developer
Reporting Tools : MicroStrategy and OBIEE
Data Quality : Informatica Data Quality (IDQ)
Scheduling Tools : ESP (CA Workstation), Control-M, Informatica Scheduler
Languages : SQL, PL/SQL
Testing Tools : HP ALM, HP Quality Center 12.21
ETL Code Migration Tools : UCD, Harvest and Kintana Package
Versioning Tools : GitHub and SVN
Data Visualization : Splunk
Functional:
Good knowledge in property/casualty insurance business.
Playing a lead role in handling all integration and functional requirements and creating design documents for each requirement.
Conducting system study and coordinating with team members for System Design & Integration, Application Maintenance, etc.
Handling various technical aspects like software design, coding of modules, monitoring critical paths & taking appropriate actions.
Managing smooth implementation of projects at client location.
Extending post-implementation, application maintenance and technical support to the client.
Developing projects using agile methodology.
ORGANISATIONAL EXPERIENCE
Major Projects Executed
Project Title : Revenue Connection Enhancements and Application Support
Company : Accenture LLP
Client : Nationwide Insurance, Columbus, OH
Duration : July 2016 to date
Environment: Informatica 10.2, Teradata 15.2, Oracle 12C, Unix, MicroStrategy
Scope : The current project deals with the implementation of ETL solutions in compensation systems, covering personal lines and commercial lines policies and primarily calculating base and variable commissions for exclusive and independent agents. It deals with various policies such as Auto, Property, Fire, Life and Earthquake from different source systems such as Allied, Harleysville and THI, and implements various complex ETL solutions to calculate commissions for exclusive and independent agents.
Responsibilities:
Maintained data integrity through a high-performance ETL application that runs 24x7, processing hundreds of jobs to support business functionality.
Introduced Operation Bullet Proof (OBP), an Agency Solutions IT initiative focused on preventing widespread impacts to agent compensation. While other efforts addressed gaps in the build and end-to-end compensation process, OBP takes a deeper dive into the data and flows that support the compensation plans.
Responsible for keeping the Revenue Connection application up and running 24x7 and for its enhancements.
Implemented ETL solutions in compensation systems covering personal lines and commercial lines policies, primarily calculating commissions for exclusive and independent agents.
Dealt with various policies such as Auto, Property, Fire, Life and Earthquake from different source systems such as Allied, Harleysville and THI, implementing various complex ETL solutions while calculating commissions for exclusive and independent agents.
Implemented Pushdown Optimization (PDO) to address performance issues in complex mappings whose numerous transformations were degrading session performance.
Designed and developed ETL code using technologies such as Informatica, Teradata, Perl and Unix.
Designed and developed Informatica mappings using transformations such as Aggregator, Lookup (connected and unconnected), Union, Update Strategy, SQL, Joiner, Router, Filter, Sorter and Expression.
Played key role in Build to Run meetings, Design reviews and code reviews sessions to provide impact analysis to the build line.
Decommissioned various applications without disturbing the functionality of the application.
Implemented automated Splunk alerts on threshold deviations and cut down ServiceNow tickets to the run team.
Implemented Splunk dashboards to track data trends and help identify data discrepancies.
Implemented an automated P1 ticket process for source delays, which saves time during non-business hours.
Tuned SQL queries to overcome spool space errors and improve performance.
Managed a run team of 4 developers onsite and 9 offshore.
Monitored production jobs, analyzing and resolving environmental issues by coordinating with several teams, including the Informatica admin team, Teradata DBA team and infrastructure teams.
Extensively used various Informatica components like parameters, variables and partitioning to gain performance.
Implemented pushdown optimization and partitioning techniques to improve Informatica session performance.
Implemented exchange partition loading to improve performance (see the sketch below).
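Illustrative sketch of the exchange-partition load pattern referenced above (a minimal Oracle example; COMMISSION_FACT, COMMISSION_FACT_STG, COMMISSION_SRC and partition P_2016_07 are hypothetical placeholders, not actual project objects):

-- Hypothetical staging table matching the structure of the target fact table.
CREATE TABLE COMMISSION_FACT_STG AS
SELECT * FROM COMMISSION_FACT WHERE 1 = 0;

-- Bulk-load the current cycle's data (source assumed column-aligned with staging);
-- in practice this load is written by the Informatica session.
INSERT /*+ APPEND */ INTO COMMISSION_FACT_STG
SELECT * FROM COMMISSION_SRC WHERE LOAD_DT = DATE '2016-07-31';

-- Swap the loaded segment into the target partition as a metadata-only
-- operation, avoiding long-running DML against the fact table.
ALTER TABLE COMMISSION_FACT
  EXCHANGE PARTITION P_2016_07 WITH TABLE COMMISSION_FACT_STG
  INCLUDING INDEXES WITHOUT VALIDATION;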
Project Title : FDW Agent 2.0 Enhancements and Application Support
Company : Accenture LLP
Client : State Farm Insurance, Bloomington, IL
Duration : Feb 2016 to June 2016 (Onshore) & Oct 2015 to Jan 2016 (Offshore, Bangalore)
Environment: Informatica 10.2, DB2, Unix, Cognos
Scope : The FDW Agent application provides agent information. An insurance agent interacts with individuals and organizations to promote policies and services, and the Agent application captures the different policies, benefits and agent personal information. The FDW Agent Analytics application involves studying historical data to research potential trends and analyze the effects of policy growth and benefits, which mainly helps analysts introduce new policies or make changes to existing ones.
Responsibilities:
Maintained data integrity through a high-performance ETL application that runs 24x7, processing hundreds of jobs to support business functionality.
Responsible for keeping the FDW Agent 2.0 application up and running; analyzed and resolved environmental issues by coordinating with several teams, including the Informatica admin team and the Oracle DBA team, escalating at the proper time and resolving problems well within the agreed timelines.
Responsible for impact analysis and decommissioning ETL applications.
I focus on customer’s needs through high performance applications and providing quality data.
Develop a deep understanding of systems and processes to extract insights from existing data while identifying and recommending IT enhancements to improve data quality; develop strong partnerships with IT application owner and data management teams to align on a roadmap for continual improvement.
Led an innovation agenda, continually seeking to leverage the best techniques to improve performance.
Project Title : Cisco xRM CTB
Company : Accenture LLP, Bangalore
Client : Cisco Systems
Duration : May – 2013 to Sep-2015
Environment : Informatica 9.x, Oracle and OBIEE
Scope : PMC 2.0 (Performance Metrics Central) provides partners with performance metrics and operational indicators. It measures partners' performance in terms of their ability to attach, renew and provide services on Cisco products. It creates and publishes monthly metrics such as Attach Rate (AR), Renewal Rate (RR), Conversion Rate (CR), Service Request (SR) and Return Material Authorization (RMA) based on the Install Base (IB), Sales Orders, Service Contracts etc. extracted from EDW.
It is an internal and external application and the web-based reporting is available for both Cisco Partners and internal users. PMC 2.0 is built with technologies such as Informatica, OBIEE reporting and Oracle Database.
Responsibilities:
Conducted a series of discussions with business analysts to understand the requirements and improve potential value to the client.
Identified requirements with functional analysts and designed the ETL process according to the business requirements.
Used an agile approach to meet with business partners, iteratively showcase deliverables and implement the reports.
Presented the prepared TDS document to senior IT analysts for design approval.
Performed impact analysis of the business requirement changes.
After obtaining design approvals, developed mappings using various transformations and workflows in Informatica 9.6 based on the TSD.
Designed and developed complex mappings to handle complex business logic.
Designed and developed Informatica mappings using transformations such as Aggregator, Lookup (connected and unconnected), Union, Update Strategy, SQL, Joiner, Router, Filter, Sorter, Java and Expression.
Prepared standard documents for known issues (KEDB) and workaround procedures for unknown issues.
Used Unix scripts to perform sanity checks like data files availability.
Configured alerts to alert support resources and business users on failures, errors, completion of the jobs.
Extensively used various Informatica components like parameters, variables and partitioning to gain performance.
Worked with Memory management for the better throughput of sessions containing Lookup, Joiner, Sorter and Aggregator transformations.
Applied Informatica performance tuning logic and tuned SQL statements by passing optimizer hints wherever applicable (see the sketch after this list).
Created UNIX scripts to SFTP and archive files and to trigger Informatica workflows.
Provided knowledge transition to the production support team to understand the new functionality.
Develop a deep understanding of systems and processes to extract insights from existing data while identifying and recommending IT enhancements to improve data quality; develop strong partnerships with IT application owner and data management teams to align on a roadmap for continual improvement.
Led an innovation agenda, continually seeking to leverage the best techniques to improve performance.
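Illustrative sketch of hint-based SQL tuning as referenced above (a minimal Oracle example; the tables, columns and hint choices are hypothetical, not the actual PMC 2.0 queries):

-- Hypothetical source-qualifier query tuned with optimizer hints:
-- parallelize the large install-base scan and force a hash join.
SELECT /*+ PARALLEL(ib, 4) USE_HASH(ib so) */
       ib.serial_number,
       so.sales_order_id,
       so.order_qty
FROM   install_base ib
JOIN   sales_order  so
  ON   so.install_base_key = ib.install_base_key
WHERE  so.order_dt >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -1);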
Project Title : GE Healthcare Middle ware
Company : Tata Consultancy Services
Client : GE Healthcare, Bangalore
Duration : Feb-2011 to Apr 2013
Environment: Informatica 8.6, Oracle and Unix.
Scope : This project mainly deals with the implementation of healthcare product purchases, equipment maintenance and services used in the diagnosis, treatment and monitoring of patients and in the development and manufacture of biopharmaceuticals. The project follows the agile process laid out by TCS for the project life cycle, and I gained good knowledge of the agile process on this project.
Responsibilities:
Conducted a series of discussions with end users and business analysts to understand the requirements.
Coordinated with functional analysts to design the ETL process according to the business requirements.
Prepared Integrated Design Specification (IDS) and low-level mapping sheet from the requirement specified in Integrated Requirement Specification (IRS) and mapping sheet provided by users.
Presenting the prepared IDS document in Integrated Coaching Session (ICS) for design approval.
Converted the business requirements into technical specifications for ETL process.
Designed and developed complex mappings to handle complex business logic.
Worked with pre- and post-session SQL commands to drop and recreate indexes on the data warehouse tables via the Source Qualifier transformation in Informatica PowerCenter (see the sketch after this list).
Prepared unit test scripts and performed unit testing and UAT for the developed workflows and mappings.
Worked on CDC data coming from transactional systems.
Provided support during the warranty period and knowledge transition to the production support team after the warranty period.
Wrote UNIX shell scripts to extract and mail error data, archive old data, check for indicator files before running sessions, and create the needed parameter files on the UNIX backend.
Prepared ETL mapping documents for every graph and a data migration document for the smooth transfer of the project from the development environment to testing and then to production.
Created procedures, functions and views to support functionality.
Worked independently in the design, development, testing, implementation and maintenance of systems of moderate size/complexity with a fast turn-around.
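Illustrative sketch of the pre-/post-session SQL pattern referenced above (a minimal Oracle example; the index, table and column names are hypothetical placeholders):

-- Pre-session SQL (hypothetical names): drop the index so the bulk load
-- is not slowed down by row-by-row index maintenance.
DROP INDEX idx_equip_svc_fact;

-- ... Informatica session loads EQUIP_SVC_FACT here ...

-- Post-session SQL: recreate the index once the load completes.
CREATE INDEX idx_equip_svc_fact
  ON equip_svc_fact (equipment_id, service_dt)
  NOLOGGING;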
Project Title : TCL-ER (Enterprise Reporting)
Company : Tata Consultancy Services, Mumbai
Client : Tata Capital
Duration : Oct-2009 to Jan-2011
Environment: Oracle Warehouse Builder (ETL), Oracle 10G, Business Objects
Scope : Tata Capital Ltd (TCL), a wholly owned subsidiary of Tata Sons Limited, is a non-deposit-taking NBFC undertaking fund- and fee-based activities in the financial services sector. TCL decided to implement the TCS BaNCS Compliance Solution to meet its regulatory compliance requirements pertaining to Anti-Money Laundering. Data from various sources like BANCS, ORBIT, CHANNEL FINANCE, LOS and CEQ is loaded into the ER database and is further used for reporting purposes.
Responsibilities:
Conducted a series of discussions with end users to understand their requirements.
Coordinated with functional analysts to design the ETL process according to the business requirements.
Involved heavily in ETL design and automation.
Designed and implemented performance tuning strategy for OWB mappings, SQL queries and databases.
Developed mappings using various transformations like Filter, Aggregator, Expression, Lookup etc.
Tracked CRs and defects in Mantis, implementing CRs and changing existing functionality as per requirements.
Created procedures, functions, materialized views and views to support functionality (see the sketch after this list).
Exported the project to various regions such as UAT, Pre-Prod and Production as and when required.
Designed and developed high-level load processes and error-handling strategies.
Developed and implemented Auditing Procedures.
Provided support during the warranty period and knowledge transition to the production support team after the warranty period.
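Illustrative sketch of a reporting materialized view as referenced above (a minimal Oracle example; the object and column names are hypothetical placeholders, not actual ER objects):

-- Hypothetical summary materialized view refreshed after each ER load.
CREATE MATERIALIZED VIEW mv_aml_txn_summary
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT src_system,
       txn_dt,
       COUNT(*)        AS txn_count,
       SUM(txn_amount) AS txn_amount
FROM   er_transactions
GROUP  BY src_system, txn_dt;

-- Refreshed at the end of the load batch, e.g.:
-- EXEC DBMS_MVIEW.REFRESH('MV_AML_TXN_SUMMARY', 'C');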
Project Title : TED-PEPR O-Maintenance Support
Company : Tata Consultancy Services, Mumbai
Client : Absi Corporation.
Duration : Mar-2009 to Sep-2009
Environment: Informatica 8.6.1, Oracle 10G, UNIX
Scope : Processing Health Care Records received from Managed Care Support Contractors worldwide, verification of the accuracy of the same and checking the legitimacy of payments. The project mainly involves the Maintenance of the modules which were migrated for Health Care Service Records (HCSR) and TRICARE Encounter Data (TED) data from IBM OS/390(Mainframe System) to Oracle DB.
Responsibilities:
FTP of files received from the contractors to the landing area
Validating the Transmission Files and splitting valid files using AIX Shell Scripting
Loading split data into Oracle Databases
Running Informatica mappings to process the received records
Generating reports on the processed data using Oracle Stored Procedures and Cognos
FTP of files back to the contractor
Collecting the requirements for TED - PEPR Maintenance Support Change Requests (CRs).
Analysis of Change Request
Developing and changing Unix shell scripts
Project Title : Virgin Media – Enterprise DWH
Company : Tata Consultancy Services, Mumbai
Client : Equifax, UK
Duration : Apr 2008 to Feb-2009
Environment: Data Stage, Oracle, Unix
Scope : This Project involved creating a central repository of Credit Estimates data such that the data is available for use by internal and external S&P applications through the Ratings Look up Logic (RLL).
Responsibilities:
Developed ETL mappings using the requirement document and technical design documents.
Developed ETL mappings using various transformations.
Involved in certain documentation work like Technical Specification, Release Notes, and Incident Resolution Report etc.
Created Informatica reusable objects (Sessions, Mapplets, Worklets, Transformations).
Scheduled and automated the jobs using Informatica's scheduler.
Tuned the Informatica environment, including mappings and sessions.
EDUCATION
B. Tech (Electronics), J.N.T. University, Hyderabad, 2006