Kirankumar Marabanahalli Rangappa
**********@*****.***
Professional Summary
16+ years of experience in data migration, data integration, and data warehousing using Informatica 10.x, Teradata BTEQ, Oracle SQL, and UNIX scripting.
Certified AWS Cloud Practitioner and SnowPro Core (Snowflake).
Extensively used ETL methodology for requirements analysis, design, coding, testing, documentation, implementation, and production support of ETL using data warehousing tools.
Extracted, transformed, and loaded (ETL) data from various sources into target databases, data warehouses, and data marts using Informatica PowerCenter 9.1/10.2 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
Understand customer requirements and perform logical and physical data modeling for stage and integration tables.
Worked on solution approaches for extracting data from Salesforce CRM, FTP/SFTP flat files on remote servers, SAP BCI, Oracle GL, and DB2, applying change data capture logic on Oracle stage tables.
Experienced across the complete DWH domain and development life cycle.
Developed PL/SQL stored procedures, views, materialized views, and complex SQL queries on Oracle databases.
Developed reusable transformations, mapplets, and objects in Informatica PowerCenter; created reusable batch scripts, UNIX scripts, and database objects.
Worked closely with customers to understand requirements and reported issues in detail, communicating regularly to resolve and fix defects within the stipulated Service Level Agreements.
Developed UNIX scripts to download files from remote servers, perform sanity checks and file validation, and call Informatica jobs via the pmcmd command (see the sketch at the end of this summary). Knowledgeable in Data Quality Management.
Production support: troubleshooting, resolving ongoing maintenance issues, and bug fixes; monitoring ETL job performance and tuning Oracle databases, mappings, and workflows to reduce nightly batch run times.
Delivered design documents, mapping documents, and Visio flow diagrams for complex data integration and migration efforts.
Good working experience with Waterfall and Agile methodologies.
Ability to adapt to new environments and circumstances, with a drive for results through self-motivation.
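A minimal sketch of the file-handoff pattern described above, assuming a standard PowerCenter installation; the host, directories, credentials, and workflow name are placeholders:

#!/bin/ksh
# Illustrative sketch: host, directories, and workflow name are placeholders.
SRC_HOST=remote.example.com
IN_DIR=/data/inbound
FILE=daily_extract.csv

# Pull the file from the remote server over SFTP
sftp -b - batchuser@"$SRC_HOST" <<EOF || exit 1
get /outbound/$FILE $IN_DIR/$FILE
quit
EOF

# Sanity check: the file must exist and be non-empty
[ -s "$IN_DIR/$FILE" ] || { echo "file missing or empty" >&2; exit 1; }

# Trigger the Informatica workflow through pmcmd and wait for completion
pmcmd startworkflow -sv INT_SVC -d DOMAIN -u "$PMUSER" -p "$PMPASS" \
    -f FOLDER -wait wf_daily_load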
Technical Skills
Cloud : AWS (EC2, S3, PostgreSQL, Redshift DW), Snowflake
ETL : Informatica PowerCenter 10.x, 9.x
OS : Windows, UNIX
Programming : SQL, PL/SQL, Shell Scripting
Database : Oracle, SQL Server, Snowflake, AWS PostgreSQL, Redshift, Teradata
Database Tools : TOAD, SQL Navigator, SQL Developer
Other Tools : Erwin, Bitbucket, Control-M scheduler
Education:
Master of Computer Application (MCA)
University: Visvesvaraya Technological University, Belgaum, Karnataka, India
Year: 2005
Employment History
Employer | Designation | EMPID | Start and End Date
HCL Technologies Ltd, India | Software Engineer | 40184022 | 19-Jul-2007 to 16-Jul-2010
Tech Mahindra Limited, India | Sr. Software Engineer | 289627 | 03-Aug-2010 to 17-Feb-2014
Deloitte Consulting Pvt. Ltd. | Sr. Consultant | 380029 | 24-Feb-2014 to 28-Jul-2020
Apptad Inc | Informatica Lead Developer | - | Aug-2020 to Dec-2020
ERP Marks Inc | - | - | Jan-2021 to Jul-2022
Syrainfotek LLC (CloudQ) | ETL Developer/Support | - | Jul-2022 to Dec-2023
Project details:
Client: Elevance Health, Atlanta, GA Feb 2024 – Present
Role: Senior Software Developer
Project: Enterprise Data Warehouse (EDWard)
Subject Area: Claims
Technical Environment: Informatica PowerCenter 10.x, Teradata, BTEQ scripts, UNIX shell scripting.
Description:
Elevance Health (formerly Anthem, Inc.) is one of the largest health benefits companies in the United States. Through its affiliated health plans, it delivers leading health benefit solutions through a broad portfolio of integrated health care plans and related services, along with a wide range of specialty products such as life and disability insurance benefits, dental, vision, and behavioral health benefit services, as well as long-term care insurance and flexible spending accounts.
This project provides data solutions for the existing system as well as enhancements to the enterprise data warehouse (EDWard). EDWard is the single source of truth for several downstream applications such as RHI, ERISA, Product Analytics, EGD, and MMHPlus. These applications use Informatica ETL to pull data from different source systems and Teradata BTEQ scripts to load data into the final EDWard tables.
Responsibilities:
• Understanding the business model and customer relationship requirements for moving data from the Teradata system to the EDW database.
• Used Informatica to design and develop mappings for extracting, transforming, and loading data from sources into landing zone tables.
• Working on BTEQ scripts for multi-table loads: landing zone to PP tables, then into the Conformed Staging Area, using Teradata BTEQ and UNIX scripting (a minimal BTEQ sketch follows this list).
• Creating mappings and workflows in line with WellPoint standards.
• Involved in creating design documents and test result documents, and worked with shell scripting.
• Understanding the Business Specifications Document and designing the Technical System Design Documents.
• Enhanced and developed mappings that perform extraction, transformation, and loading of source data into target systems using transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to meet business logic.
• Quality Assurance: prepared unit test cases and supported the testing team in preparing and validating SIT (System Integration Testing) test cases with business and users.
• Reviewing work products and deploying the application in the test environment and initiating the process of UAT (User Acceptance Testing).
• Production deployment and warranty: assisted production implementation by validating components and executing jobs in production; monitored day-to-day processes and promptly fixed production problems when encountered.
• Regular interaction with managers, product owners, stakeholders, and offshore members to ensure the smooth running of applications.
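A minimal BTEQ sketch of the landing-zone-to-PP load pattern referenced above; the TDPID, credentials, and table names are placeholders:

#!/bin/ksh
# Minimal BTEQ sketch; TDPID, credentials, and table names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Move the day's claims from the landing zone to the PP table */
INSERT INTO pp_db.claims_pp
SELECT * FROM lz_db.claims_lz
WHERE load_dt = CURRENT_DATE;

/* Return a non-zero code on any SQL error so the scheduler can alert */
.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF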
Client: Elevance Health, Atlanta, GA Jan 2021 to Dec 2023
Role: Senior Software Developer / Onsite Lead
Project: SPS Data Migration
Technical Environment: Informatica PowerCenter 10.x, Oracle SQL/PL-SQL, UNIX, EFX, Bitbucket.
Description:
The client is one of the largest healthcare insurance companies. The existing system contained a great deal of inconsistent and bad data. The objective of the project is to migrate the current provider system to the new SPS. As part of this project, provider data from the GA, CPMF, ACES, CRF, GBD, CPF, and R6 source systems is extracted from a common file format, transformed per business rules, and loaded into landing zone tables. Data is then extracted from the landing zone, translated per SPS codes with error handling, and loaded into the new SPS system.
Responsibilities:
Worked on provider data migration using Oracle SQL and PL/SQL and Informatica PowerCenter 10.4 on UNIX.
Worked in a data load team to execute end-to-end jobs and deliver data downstream.
Worked on setting up non-prod environments in Control-M, database refreshes from prod, and PY/UNIX deployment activities so that the respective teams could execute the DART load end to end before prod execution.
Provided PERF/UAT/SIT support: troubleshooting, resolving ongoing maintenance issues and bug fixes, monitoring ETL job performance, and tuning Oracle databases, mappings, and workflows to reduce run time during the data migration.
Worked closely with the Control-M team to create around 300 Control-M jobs with dependencies.
Worked closely with the MATCH team to resolve any issues with the MATCH code.
Worked closely with the DevOps team to create Bitbucket repositories and Bamboo plans for projects such as PRICING, VAMES, SCW, DART, DART-DI, DI, BEACON, OHCAID, GACAID, and UIX.
Coordinated deployment of DART, DART-DI, PRICING, VAMES, SCW, DI, BEACON, OHCAID, GACAID, and UIX code into SIT, UAT, PERF, and PROD.
Client: State of WI, Madison, WI Sep 2019 to Jul 2020
Role: Senior ETL Informatica Developer - Lead
Project: CARES-IMMR
Technical Environment: Informatica PowerCenter 10.x, Oracle SQL and PL/SQL, UNIX, EFX.
Description: The Income Maintenance Management Reporting (IMMR) is a multi-layer enterprise data warehouse, business intelligence and reporting solution, which supports the State’s regionalized operating model for eligibility determination of public assistance programs. IMMR provides a business intelligence, reporting and analytics platform for DHS and County Agency stakeholders to answer critical business questions to improve program operations and make strategic decisions.
Responsibilities:
Analyzing the client's data warehousing needs and identifying the distinct functional areas.
Interpreting complex data on target systems to provide resolutions for reporting requirements.
Designing the data model in Oracle for new sets of tables in discussion with the Data Architect.
Coordinating with the DBA team to design datasets, tables, and views in data sources such as Oracle, SQL Server, and DB2.
Designing and customizing data models for the enterprise data warehouse supporting data from multiple sources, including relational and dimensional data modeling using star and snowflake schemas, denormalization, normalization, and aggregations.
Developing mappings to extract data from DB2, SQL Server, Oracle, and flat files and load it into the data warehouse using the Mapping Designer.
Developing mappings using various transformations such as Aggregator, Expression, Filter, Lookup, Joiner, Sequence Generator, Stored Procedure, Update Strategy, and Rank.
Using Workflow Manager extensively to create tasks, worklets, and workflows, and executing the tasks.
Creating technical specification documents for ETL mappings.
Monitoring database performance, implementing session partitions, and rebuilding indexes to improve throughput.
Implementing Slowly Changing Dimension (Type I and Type II) mappings per requirements.
Understanding the business requirements in depth and converting them into design specifications.
Creating complex queries for use as derived tables and materialized views, which are used for generating reports.
Developed PL/SQL blocks to implement audit jobs and other miscellaneous activities (a sketch of the audit pattern follows this list).
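A hypothetical sketch of the audit-block pattern run through SQL*Plus; the connection variables and the etl_audit and stg_case_data names are illustrative:

#!/bin/ksh
# Hypothetical audit-logging sketch; connection and table names are illustrative.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  -- Record the nightly load's row count in an audit table
  INSERT INTO etl_audit (job_name, run_date, row_count)
  SELECT 'IMMR_NIGHTLY', SYSDATE, COUNT(*) FROM stg_case_data;
  COMMIT;
END;
/
EXIT
EOF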
Client: Anthem, Atlanta, GA May 2018 to Aug 2019
Role: Senior ETL Informatica Developer - Lead
Technical Environment: Informatica PowerCenter 10.x, Oracle, UNIX, DB2, EFX.
Project: DDS-WGS Specialty (WGS E&B)
Description:
The client is one of the largest healthcare insurance companies. The existing system contained a great deal of inconsistent and bad data.
The objective of the project is to migrate dental specialty members from the DDS system to the WGS system, enabling a single ID card and combined billing administration for both dental and medical coverage.
Five key entities are migrated from DDS (dental care) to WGS: case, billing entity, delinquency, group, and member. Specialty member migration was the key activity: before migrating to WGS, a member-match process runs on demographic details; if the member already exists, the same HCID is reused, otherwise a new HCID is created before migration to WGS.
Responsibilities:
Created design documents, the data model, and Informatica jobs; reviewed them with the client and obtained approval to move to the development phase.
Onshore coordination with the DDS system team and the offshore team on all ETL job requirements.
Designed, implemented, and maintained ETL platforms.
Developed ETL architecture in accordance with business requirements.
Provided proper resolutions for ETL-related issues.
Maintained documentation for ETL processes.
Assisted in the development of ETL architectural standards.
Involved in code migration from lower to higher regions.
Involved in production execution, ensuring all audit and validation reports were valid.
Client: Anthem, Atlanta, GA Nov 2017 to May 2018
Role: Senior ETL Developer
Project: SPS-GBD Data Migration
Technical Environment: Informatica PowerCenter 10.x, Oracle, UNIX, DB2, EFX.
Description:
The client is one of the largest healthcare insurance companies. The existing system contained a great deal of inconsistent and bad data.
The objective of the project is to migrate the current provider system to the new SPS. As part of this project, GBD and Multiplan source system data is extracted from a common file format, transformed per business rules, and loaded into landing zone tables. Data is then extracted from the landing zone, translated per SPS codes with error handling, and loaded into the new SPS system.
Responsibilities:
Interacted with source vendors and the data model team to understand the existing system and the new SPS data model.
Created many complex Informatica jobs to migrate data from the legacy source system to the new SPS system.
Created SQL to apply business rules in the Source Qualifier.
Created a UNIX shell script to validate and archive CFF files (a sketch follows this list).
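A sketch of the validate-and-archive step; the directories and the .cff extension are assumptions for illustration:

#!/bin/ksh
# File validation/archive sketch; directories and extension are placeholders.
IN_DIR=/data/cff/inbound
ARC_DIR=/data/cff/archive

for f in "$IN_DIR"/*.cff; do
    [ -e "$f" ] || continue      # nothing to process
    if [ ! -s "$f" ]; then       # reject zero-byte files
        echo "empty file rejected: $f" >&2
        continue
    fi
    # Compress and move the validated file to the archive with a date stamp
    gzip "$f"
    mv "${f}.gz" "$ARC_DIR/$(basename "$f").$(date +%Y%m%d).gz"
done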
Client: Anthem, Atlanta, GA Aug 2017 to Nov 2017
Role: IDQ Developer / ETL Developer
Project: Strategic Provider System (SPS) - Data Profiling
Technical Environment: Informatica PowerCenter 10.x, Oracle, UNIX, DB2, EFX.
Description:
The client is one of the largest healthcare insurance companies. The objective of the project is to create the new SPS system database with consistent data from the EPDSV2 database. The current EPDSV2 database contains a lot of bad and invalid data; as part of the data profiling team, we use Informatica IDQ to profile data and run scorecards for valid and invalid data.
Responsibilities:
Interacted with other teams to set up the environment for Informatica IDQ.
Created Informatica mappings and workflows per requirements.
Created profiles for the required tables and files.
Created scorecards on top of the profiles to present valid and invalid data per requirements.
Created SQL for various scenarios and complex rules as part of manual data profiling (see the sketch after this list).
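A sketch of the kind of query used in the manual profile; the provider_stg table and provider_npi column are illustrative (an NPI is a 10-digit identifier):

#!/bin/ksh
# Manual data-profiling sketch; table and column names are illustrative.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
-- Null, distinct, and format counts: the typical first pass of a manual profile
SELECT COUNT(*)                     AS total_rows,
       COUNT(provider_npi)          AS non_null_npi,
       COUNT(DISTINCT provider_npi) AS distinct_npi,
       SUM(CASE WHEN LENGTH(provider_npi) <> 10 THEN 1 ELSE 0 END) AS bad_npi_len
FROM   provider_stg;
EXIT
EOF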
Client: Madison Square Garden, New York, NY Jun 2016 to Jun 2017
Role: Senior ETL Developer - Lead
Project: MSGBI
Technical Environment: SAP Data Services on AWS, PostgreSQL, Redshift, UNIX.
Description:
Madison Square Garden is a sports and entertainment company that owns its own arena and teams including the Rangers, Knicks, and Liberty, and hosts many sports and entertainment events. Ticketmaster is a third-party company that sells MSG event tickets online.
The objective of the project is to create sales reports: data arrives from the Ticketmaster team as flat files and is loaded into stage, dimension, and fact tables, and reports are generated on top of the dimension and fact tables using Tableau.
Responsibilities:
Served as ETL lead from offshore, taking care of daily production loads.
Created many shell scripts invoked through the SAP Data Services ETL tool.
Created shell scripts to copy source files to an Amazon S3 bucket and load them into stage tables from S3 (see the sketch after this list).
Created SAP Data Services jobs to generate dynamic delete and insert SQL, loading many stage tables in the PostgreSQL and Redshift databases.
Created PostgreSQL SQL to extract data from many stage tables per business requirements.
Created PostgreSQL SQL to validate data per business rules.
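A sketch of the S3 staging pattern, assuming the AWS CLI, a .pgpass entry for psql, and Redshift COPY; the bucket, file, table, and IAM role are placeholders:

#!/bin/ksh
# S3 staging sketch; bucket, file, table, and IAM role are placeholders.
FILE=/data/tm/sales_extract.csv
BUCKET=s3://msg-etl-stage/ticketmaster/

# Push the Ticketmaster extract to S3
aws s3 cp "$FILE" "$BUCKET" || exit 1

# Load the staged file into Redshift with COPY, via psql against the cluster endpoint
psql -h redshift-host -U etl_user -d dw <<'EOF'
COPY stg_sales
FROM 's3://msg-etl-stage/ticketmaster/sales_extract.csv'
IAM_ROLE 'arn:aws:iam::000000000000:role/redshift-load'
CSV IGNOREHEADER 1;
EOF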
Client: Thermo Fisher, Carlsbad, CA (Offshore) Nov 2015 to May 2016
Role: Informatica Developer
Project: Sales at Cost (SAC)
Description:
Thermo Fisher has many dealers across NA and EU countries. We received customer details, products sold, and sale amounts from the different dealers; the data represents what was sold through the channel on a calendar basis. The objective of the project is to automate data loading and data quality checks, addressing known data quality shortcomings, discovering hidden data quality issues, and creating a consistent, ongoing process for monitoring and improving data quality.
Responsibilities:
Worked closely with business analysts to gather requirements.
Worked on the solution approach and design document to bring data from different dealers into the SAC main table.
Worked on DDL for the STAGE and MAIN tables, and on Informatica mappings to load data from flat files into the STAGE table.
Worked on SAC Type 2 Informatica mapping development to move data from STAGE to MAIN (an SQL sketch of the Type 2 logic follows this list).
Performed unit testing for the ETL and shared test results with the BA.
Worked closely with business analysts during integration testing.
Prepared deployment scripts to move code from DEV to TEST and TEST to PROD regions.
Performed Informatica administration to move code from the DEV to the TEST region.
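The load itself ran as an Informatica mapping (lookup plus update strategy); below is only an equivalent SQL sketch of the Type 2 logic, with illustrative table and column names:

#!/bin/ksh
# SQL equivalent of the SCD Type 2 pattern; names are illustrative.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
-- Close out the current version of any row whose tracked attribute changed
UPDATE sac_main m
SET    m.eff_end_dt = SYSDATE, m.curr_flag = 'N'
WHERE  m.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM sac_stage s
               WHERE  s.dealer_id = m.dealer_id
               AND    s.sold_amt <> m.sold_amt);

-- Insert a fresh current version for new or changed keys
INSERT INTO sac_main (dealer_id, sold_amt, eff_start_dt, eff_end_dt, curr_flag)
SELECT s.dealer_id, s.sold_amt, SYSDATE, NULL, 'Y'
FROM   sac_stage s
WHERE  NOT EXISTS (SELECT 1 FROM sac_main m
                   WHERE  m.dealer_id = s.dealer_id
                   AND    m.curr_flag = 'Y');
COMMIT;
EXIT
EOF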
Client: Thermo Fisher, Carlsbad, CA (Offshore) Feb 2015 to May 2016
Role: ETL Developer
Project Name: Helios-BI
Technical Environment: Informatica PowerCenter 10.x, Oracle PL/SQL, UNIX, EFX.
Description:
The main objective of the Helios project is the integration of SAP (Global) and SAP (Germany) into the Enterprise Data Warehouse (EDW) across various subject areas. Business Content Integration (BCI) is the method used to bring SAP data into the EDW. There were two sets of ETL: ETL1 brings data from SAP into stage tables using BCI, and ETL2 extracts data from the stage tables into dimension and fact tables. There were two types of loads from SAP to EDW: an initial full load, followed by incremental loads.
Responsibilities:
Worked on the solution approach to bring data from SAP to the EDW.
Worked on DDL for the STAGE and DIMENSION tables.
Worked on creating complex views containing Change Data Capture (CDC) logic.
Worked on Informatica mapping development to bring data from SAP to STAGE.
Worked on Informatica mapping development to bring data from STAGE to DIMENSION and FACT.
Worked on a PL/SQL stored procedure to pull data from stage and update-else-insert into the dimension table, using HCC on an Oracle Exadata database (a MERGE-style sketch follows this list).
Performed unit testing for the ETL and shared test results with the BA.
Prepared deployment scripts to move code from DEV to TEST and TEST to PROD regions.
Performed Informatica administration to move code from the DEV to the TEST region.
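A MERGE-style sketch of the update-else-insert step; the real logic ran inside a PL/SQL stored procedure, and the table and column names here are illustrative:

#!/bin/ksh
# Update-else-insert sketch in MERGE form; names are illustrative.
sqlplus -s "$DB_USER/$DB_PASS@$DB_SID" <<'EOF'
MERGE INTO dim_material d
USING stg_material s
ON (d.material_id = s.material_id)
WHEN MATCHED THEN
  UPDATE SET d.material_desc = s.material_desc,
             d.upd_dt        = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (material_id, material_desc, ins_dt)
  VALUES (s.material_id, s.material_desc, SYSDATE);
COMMIT;
EXIT
EOF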
Client: Thermo Fisher, Carlsbad, CA (Offshore) Apr 2014 to Feb 2015
Role: ETL Developer
Project Name: QTC
Technical Environment: Informatica PowerCenter 10.x, Oracle, UNIX, EFX.
Description:
Life Technologies is a healthcare company that has since been taken over by Thermo Fisher. Life Tech and Thermo Fisher both have many customers and products, and many of the same customers appear in both. As part of this project we cleanse and match customers and find similar customer addresses using the Informatica Data Quality (IDQ) tool. The main objective of the project is the consolidation of customers from Life Tech and Thermo Fisher into a customer-level-view database.
Responsibilities:
Worked on the solution approach to integrate customers from E1 and SAP into the E1 database.
Worked on data modeling.
Worked in Informatica Developer to create cleanse and match ETL jobs.
Worked on ETL jobs that cleanse customer data.
Worked on ETL jobs that standardize customer data.
Worked on ETL jobs that match E1 customers with SAP customers.
Moved ETL jobs from DEV to QA.
Prepared test case scripts for cleanse and match.
Fixed bugs raised by users in the QA region.
Prepared the design document, implementation document, and standard operating procedure document.
Project Name: GE Capital Rail CCAR (Offshore)
Client: GE Capital
Duration: Mar 2013 to Feb 2014
Role: ETL Developer / Module Lead
Tools: Informatica PowerCenter 9.1.1, Teradata, Oracle.
Team Size: 10
Description:
As part of the larger CCAR initiative, a supervisory assessment by the Federal Reserve of the capital planning processes and capital adequacy of large, complex bank holding companies (BHCs), GE Rail needed to fulfill two major obligations under the 2013 program: building additional fields into the existing system of record (SOR) to support CCAR reporting, and re-architecting the business intelligence systems that support ERI reporting to achieve (a) full traceability from SOR to Rail DWH to ERI Reporting Model to IRIS submission, (b) DWH storage of the IRIS submission, and (c) submission to IRIS in the standard 12-file format. This project included re-architecting the current process to achieve the goals above.
Responsibilities:
Worked on the ETL solution approach per requirements, re-architecting the current process.
Worked on the design document to integrate the new finance lease source system with the existing system.
Worked on the design document for risk classification, previously handled in Excel macros.
Worked on ETL mappings to load data into tables from flat files using the MultiLoad and FastLoad utilities (a FastLoad sketch follows this list).
Worked on deriving new attributes in the SOR and applied SCD Type 2 loading.
Worked on a reusable shell script to archive manual source files.
Worked on creating Teradata views joining multiple tables based on conditions.
Performed unit testing for Informatica ETL jobs.
Prepared unit test case scripts and test results for Informatica ETL jobs.
Performed code migration from the DEV region to the UAT region.
Performed UAT support for Informatica ETL jobs.
Prepared the implementation document for production code migration.
Prepared the standard operating procedure (manual book) document for the project.
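A FastLoad sketch of the delimited flat-file load pattern named above; the TDPID, file layout, and table names are placeholders:

#!/bin/ksh
# FastLoad sketch for a delimited flat file; TDPID, layout, and names are placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
SET RECORD VARTEXT "|";
DEFINE asset_id   (VARCHAR(18)),
       asset_type (VARCHAR(10))
FILE = /data/ccar/rail_asset.txt;
BEGIN LOADING stg.rail_asset ERRORFILES stg.rail_asset_e1, stg.rail_asset_e2;
INSERT INTO stg.rail_asset (asset_id, asset_type)
VALUES (:asset_id, :asset_type);
END LOADING;
LOGOFF;
EOF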
Project Name: Customer Data Management
Client: GE Capital Australia (Onshore and Offshore)
Duration: Jan 2012 to Dec 2012
Role: Module Lead / Team Lead
Tools: Informatica PowerCenter 9.1.1, Oracle Exadata, Teradata
Team Size: 18
Description:
The main objective of this project is to reduce data quality management costs by bringing the Acxiom function in-house, while also improving its timeliness.
CLV is the database that stores consolidated customer-related data from different source systems. The data is extracted from different ODS systems such as ICBS, VisionPlus 256, and 823, stored in STG tables, then extracted from STG to flat files based on a date range. These flat files are used for cleansing and standardizing addresses. Finally, the cleansed, standardized, and consolidated customer data is stored in the CLV database.
Responsibilities:
Onsite coordination to understand the requirements and project flow; provided solution approaches to developers per requirements.
Merged daily and quarterly refresh ETL jobs for all account and application related data.
Applied CDC logic while extracting data from ODS to STG.
Introduced a trigger-table concept to force-process specific customer records.
Developed SCD Type 1 and Type 2 mappings for customer address history and contact history details in CLV database tables.
Performed unit testing for Informatica ETL jobs.
Prepared unit test case scripts and test results for Informatica ETL jobs.
Performed code migration from the DEV region to the UAT region.
Performed UAT support for Informatica ETL jobs.
Prepared the implementation document for production code migration.
Prepared the hand-over document for the project.
Project Name: Integration of Salesforce data into DWH
Client: GE Capital ANZ (Offshore)
Duration: Mar 2011 to Nov 2011
Role: Team Member / Module Lead
Technologies: Informatica PowerCenter 9.1.1, Oracle Exadata, Teradata
Team Size: 2
Description:
The main objective is to build the Salesforce objects in the data warehouse, making all standard and custom objects available in the DWH. The data warehouse would initially synchronize data with Salesforce on a daily basis, but would be built to also accommodate refresh on demand. The project involved replicating objects from the Salesforce system to the data warehouse: the initial Salesforce data load would be a full load, and all further loads would be incremental, following the insert-else-update methodology.
Responsibilities:
Developed mappings to perform incremental extraction from Salesforce and apply the insert-else-update methodology.
Developed mappings to perform reconciliation between the Salesforce and DWH databases.
Developed a PL/SQL block to perform the control-table load based on session run status.
Developed workflows to run sets of sessions according to user requirements using Command, Decision, Event Wait, and Assignment tasks.
Performed unit testing for Informatica ETL jobs.
Prepared unit test case scripts and test results for Informatica ETL jobs.
Performed code migration from the DEV region to the UAT region.
Performed UAT support for Informatica ETL jobs.
Prepared the implementation document for production code migration.
Prepared the hand-over document for the project.
Project Name: NCCP Responsible Lending
Client: GE Capital, ANZ (Offshore)
Duration: Jan 2011 to May 2011
Role: Team Member
Technologies: Informatica PowerCenter 9.1.1, Oracle Exadata
Team Size: 3
Description:
NCCP mainly fulfills the business requirements of GE Money across all locations where it operates, providing data warehouse and BI solutions for the business; in short, it is the enterprise data warehouse product for GE Money. GE Money runs financial banking operations involving different products such as insurance and revolving credit.
Responsibility:
Involved in extracting and cleansing data from flat files into the staging area.
Developed Informatica mappings using Source Qualifier, Expression, Aggregator, Lookup, Update Strategy, Router, and Filter transformations.
Configured sessions in workflows using Command, Decision, and Assignment tasks.
Created sessions, extracted data from various sources, transformed data per requirements, and loaded it into the data warehouse.
Project Name: DB-FISS
Client: Deutsche Bank, NJ (Offshore)
Duration: Feb 2009 to Jul 2010
Role: Team Member
Technologies: Informatica 8.1.1, Oracle, and UNIX
Team Size: 6
Description:
SBO 2000 is the data warehouse for a residential mortgage loan system for New York. The main objective of the project is to provide analysis reports on each loan and CUSIP. Reports are used by senior management for trend analysis and decision-making. The main ETL functionality is data extraction from the source system into staging tables, then extraction from staging into dimension and fact tables.
Responsibility:
Performed data cleansing and transformation using Informatica.
Developed mappings to load data from staging into dimension and fact tables.
Developed workflows to run sets of sessions according to user requirements.
Responded to issues raised by users within the time scales and communication channels defined by the business.
Investigated and provided workarounds and resolutions for problems and queries relating to incidents raised through GIMS. Developed shell scripts to perform file processing, archive files, compress data and log files to reclaim disk space, and automate manual tasks to improve productivity (a housekeeping sketch follows this list).
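A housekeeping sketch of the log-compression pattern; the path and retention windows are illustrative:

#!/bin/ksh
# Log housekeeping sketch; path and retention windows are illustrative.
LOG_DIR=/app/etl/logs

# Compress logs older than 7 days to reclaim disk space
find "$LOG_DIR" -name '*.log' -mtime +7 -exec gzip -f {} \;

# Purge compressed logs older than 90 days
find "$LOG_DIR" -name '*.log.gz' -mtime +90 -exec rm -f {} \;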
Project Name: DB-Feeds
Client: Deutsche Bank, London (Offshore)
Duration: Jul 2007 to Jan 2009
Role: Team Member
Technologies: Informatica 8.1.1, Oracle, and UNIX
Team Size: 10
Description:
The DBFeeds project is a strategic initiative by Deutsche Bank to consolidate data spread over 3000 data sources, many of which are redundant because of local use of applications, down to an optimal number of data sources. This is accomplished using Informatica PowerCenter, DB's strategic ETL tool. The main ETL functionality is data extraction from source systems into staging tables, then extraction from staging, consolidation per requirements based on the reference Golden Source, and production of output data files for the TLM, SAP, and IT-CON systems.
Responsibility:
Developed mappings to load data from source systems to target systems according to requirements.
Developed workflows to run sets of sessions according to user requirements.
Developed shell scripts to perform file processing, archive files, and compress data files.
Responded to issues raised by users within the time scales and communication channels defined by the business.
Investigated and provided workarounds and resolutions for problems and queries relating to incidents raised through GIMS.