Umesh Kumar
E-mail: **.************@*****.**
Current Location: Louisville, KY
Availability: 2 Weeks
PROFESSIONAL SUMMARY:
●10+ years of IT experience in Software Analysis, Design, Development, Testing, and Implementation of Client/Server business systems, with a focus on Business Intelligence and Data Warehousing.
●Translate requirements into documentation deliverables such as functional specifications, workflow/process diagrams, and data flow/data model diagrams.
●Experience in Informatica Client tools: Power Center Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
●Experience in all phases of Software Development Life Cycle (SDLC) such as development, testing, migration, administration, and production support on various platforms like UNIX, Windows 10.
●Experience in gathering, analyzing, and documenting business and functional requirements, and in designing and developing mappings based on those requirements.
●Experience in implementing the Business rules by creating transformations (Expression, Aggregate, Unconnected and Connected Lookup, Router, Update Strategy, Filter, Joiner, Union), and developing Mappings.
●Strong knowledge of Data Warehousing concepts, Data Marts, Star Schema and Snowflake Schema modeling, Fact tables and Dimension tables.
●Implemented Slowly Changing Dimension methodology for accessing the full history of accounts and transaction information.
●Experience in the concepts of building Fact Tables, Dimensional Tables, handling Slowly Changing Dimensions and Surrogate Keys.
●Experience in Migration of objects between environments, from Local (DEV) to Test (QA).
●Experience in writing complex SQL queries and Unix Shell Scripting.
●Excellent experience in Performance tuning in SQL and query optimization.
●Strong knowledge of relational databases like Oracle 9i/10g/11g/12c/19c, SQL Server, Azure Synapse, and Azure Hyperscale on Windows/UNIX/Linux platforms, using GUI tools like TOAD, PL/SQL Developer, and SSMS.
●Analytical and Technical aptitude with the ability to solve complex problems.
●Strong interpersonal skills with ability to interact with end-users, managers and technical personnel.
●Well versed in Agile and Waterfall Methodologies.
●Experienced and well versed in writing UNIX Shell scripts as part of file manipulation, scheduling and text processing.
●Worked as Data Modeler for database creation for data migration.
●Involved in Unit Testing, Integration testing and QA Validation.
●Experience in designing and developing database applications and data support systems (ETL/ELT) for Data Warehouse / Data Mart databases, Data Staging, Operational Data Stores, and data summarization for Decision Support Systems (OLAP/BI/Analytics) across various data volumes, employing data models, requirement specifications, functional specifications, business rules, validation rules, system statistics, etc.
●Experience in design and development of various Data Integration, Data Migration, and ETL applications across different types of SDLC processes, employing data mappings, UNIX shell scripting / Windows scripting, and database programming such as UDFs, stored procedures, functions, and SQL scripting.
●Experience in various data profiling methods to perform analysis of structured, semi-structured and unstructured data sources and data consumables.
●Experience in application metadata gathering and metadata analysis of system configuration, application complexity, runtime statistics, etc.
●Experience using Informatica Power Center / Power Exchange / Analyzer / Developer, Oracle Warehouse Builder, external loaders such as Oracle SQL*Loader, Perl, etc.
●Experience in Data Masking, Data Subset for non-production data environments with Informatica Power Center.
●Experience in Informatica Power Center Load balancer and HA clustered systems. Involved in process recovery and dynamic partitioning solutions in multi-node Environment.
●Thorough understanding of mapping / transformation rules, use cases, data models / dictionaries and other artifacts constituting database and backend application systems design and development.
●Conduct Gap analysis on existing processes and potential alternatives. Identify and report on trends and patterns found within the data.
●Used logical data models, physical designs, and business rules to create mapping definitions.
●Analyze complex business requirements and design, build, and implement technology enabled solutions to address multi-discipline business opportunities/requirements.
●Investigate, analyze and address complex technical problems within the ETL processes.
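As an illustration of the UNIX shell scripting for file manipulation and text processing claimed above, the following is a minimal sketch of a flat-file validation wrapper of the kind commonly used ahead of an ETL load. The file layout (header line, pipe-delimited records, a `TRL|<count>` trailer) and directory names are hypothetical examples, not a specific project's implementation:

```shell
#!/bin/sh
# Illustrative pre-load validation: compare the trailer record count
# against the actual number of data records, then stage or reject the file.
# Layout assumed (hypothetical): header, pipe-delimited rows, "TRL|<count>".

validate_feed() {
    feed="$1"
    expected=$(tail -1 "$feed" | cut -d'|' -f2)
    # Data records = total lines minus the header and trailer lines.
    actual=$(( $(wc -l < "$feed") - 2 ))
    if [ "$expected" -eq "$actual" ]; then
        mv "$feed" stage/
        echo "OK: $actual records staged"
    else
        mv "$feed" reject/
        echo "REJECT: trailer says $expected, file has $actual" >&2
        return 1
    fi
}

mkdir -p stage reject
# Build a small sample feed to demonstrate the check.
printf 'HDR|members|20240101\nM001|A\nM002|B\nM003|C\nTRL|3\n' > feed.dat
validate_feed feed.dat
```

In practice such a wrapper would be invoked by the scheduler before the Informatica workflow, so that malformed feeds never reach the staging tables.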
EDUCATIONAL QUALIFICATION:
●B.Tech in Computer Science & Engineering from Kurukshetra University, Haryana (Batch of 2009).
WORK EXPERIENCE SUMMARY:
Humana Inc., Louisville, KY (Tata Consultancy Services) April 2019 – Present
HERO (Humana Encounter Resolution and Operation)
Azure Synapse, Hyperscale, Oracle SQL, Informatica Tech Lead /Architect
The HERO project replaces Humana's current Edifecs Encounter Submissions platform, whose unsupportable technology components and closed architecture limit the business's ability to accommodate changing requirements, streamline error management, and scale without significant workarounds. The outdated and inflexible application poses significant risk to the successful execution of Humana's encounter management process, to meeting SLAs, and to the revenue associated with the process. The new solution was intended to be a single place for all error corrections. HERO is the solution proposed for Humana's Encounter Creation, Submission, Error Management, and related functions. Humana seeks to improve its current method of creating and submitting the Electronic Data Interchange (EDI) "Encounter" data required for Medicare and Medicaid by different trading partners such as CMS and the state of Florida. The proposed architecture is an n-tier architecture based on the principles of Service-Oriented Architecture. It promotes loose coupling of components and utilizes Azure cloud products to provide the following key tenets:
●A technology platform that addresses the monolithic nature of the current system and moves to a solution that efficiently and effectively facilitates maintenance, changes, and enhancements based on business demands and needs.
●Robust process and service orchestration with industry-standard business and compliance validations, using modern design patterns and optimizing all processes involved in the end-to-end lifecycle of Encounters and related claims.
●A modular user interface to monitor encounters, with the agility to adapt or change rules based on the changing business needs of the trading partners.
●Improved efficiency and accuracy in the Encounter creation process through auto-correction capability.
●Where human intervention is required for correction, a flexible error correction management workflow that supports effective workload management while providing visibility into error corrections.
Responsibilities:-
●Design, development, unit testing, integration, deployment packaging and checkout, and scheduling of various components in Informatica Power Center and Azure Cloud through several SDLCs, implementing ETL processes for a cloud-based very large data warehouse and ODS integrated with subject-area data from disparate sources, and processes supporting downstream data marts for DSS/BI applications.
●Performed analysis of ETL applications and database systems, data structure analysis, data profiling for application design and defects resolution.
●Performed process enhancements such as capacity improvement, tuning/optimization and performance improvement of the Azure Hyperscale based ETL/Upsert processes.
●Performed extensive metadata gathering and analysis of database and metadata repositories for understanding data lineage, data flow, process design, process statistics and identifying storage elements and access to data using JSON.
●Partnered with DBA/SA to optimize database tables/indexes, partitioning, parallelization and in implementing database/client configurations to support pipeline partitioning and multi-threaded processing.
●Performed system analysis and created both functional and technical specifications documents. Designed and developed custom SQL packages and procedures.
●Assisted junior developers with design and development of SQL packages, Forms and Reports. Modified forms and reports for performance improvements.
●Built a flexible error correction management workflow for cases requiring human intervention, supporting effective workload management while providing visibility into error corrections.
●Streamlined SDLC processes by automating data sampling and data comparison for testing and analysis of source and target data for ETL application development, QA and UAT.
●Re-Platforming: Involved in database migration from Edifecs to HERO for the applications above and Appraisal Data Analytics, requiring rapid conversion methodologies and integration conversion solutions to convert legacy ETL processes into an Azure Synapse-compatible architecture, using regular-expression-driven metadata conversion such as ODBC/JDBC reads and OLAP writers, and replacements such as bulk merges and deletes with the OLAP model.
●Performance Tuning: Enhanced and troubleshot ETL process runtime contingencies by applying database tuning, query tuning, and application tuning solutions.
●Design and Architecture, ETL and Data application systems and frameworks: Performed requirements analysis, systems analysis, data analysis and designed application systems and framework architecture incorporating/implementing functional and non-functional systems specifications governed by enterprise business rules, security policies, environmental and application frameworks, delivered in both agile adopted and non-agile SDLC programs.
●Performed service requests, access requests, deployment requests, role/user requests, operations requests, etc., and related workflow management with ServiceNow.
●ADLS: Data ingestion into ADLS (storing complete files in ADLS and the file path in a database table) for space optimization.
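The ADLS ingestion pattern above (the full file goes to cloud storage; only its path is recorded in a database table) can be sketched in shell. This is illustrative only: the storage account, container, path convention, and `file_registry` table are hypothetical, and the actual upload command (shown in a comment, using the real `az storage blob upload` CLI) is replaced with a dry-run echo:

```shell
#!/bin/sh
# Illustrative sketch: land a file in date-partitioned ADLS storage and
# record only its path in a tracking table. All names are hypothetical.

ingest_to_adls() {
    src="$1"
    load_date=$(date -u +%Y/%m/%d)
    target="encounters/raw/${load_date}/$(basename "$src")"
    # A real upload would use the Azure CLI, e.g.:
    #   az storage blob upload --account-name heroadls \
    #       --container-name landing --name "$target" --file "$src"
    # Echoed here as a dry run for illustration.
    echo "UPLOAD -> $target"
    # Keep the database row small: store the path, not the payload.
    echo "INSERT INTO file_registry(file_path, loaded_at) VALUES ('$target', CURRENT_TIMESTAMP);" \
        >> registry.sql
}

ingest_to_adls edi_837_batch01.dat
```

The space optimization comes from the table holding only a short path string per file rather than megabytes of payload.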
Environment: Azure Synapse, Azure Hyperscale, SQL Server, Informatica Power Center 9.6/9.5/9.1, Oracle 12c/11g, Toad/SQL Developer, SQL scripting/tuning, PL/SQL, XML, Solaris 10, Unix Shell Scripting
Humana Inc., Louisville, KY Jan 2017 – April 2019
BOD (Board Of Director Report)
Oracle SQL Lead/ Informatica Lead Developer
The objective of this project was to make Humana (the organization) 20% healthier by 2020. The reports generated catered to business needs for accurate, actionable, end-to-end reporting for Commercial, Humana One, Medicaid, and Medicare members. The Board of Director Report is a highly interactive, self-service data visualization web application that makes it easier to understand the communities we serve from a variety of lenses: geographic, product and benefits, chronic conditions and comorbidities, provider groups, and specific impactful metrics for each chronic condition. Over 100 metrics with monthly trends are presented in this tool, enabling us to track the performance of various initiatives and to align and improve strategies to achieve the Bold Goal 2020.
The Board of Director report has been designed in collaboration with Clinical Analytics, Enterprise Technology, Market Leadership, and the Office of the Chief Medical Officer / Bold Goal. Its purpose is to support Market Leadership and other stakeholders in identifying key performance indicators that are correlated with Healthy Days. By identifying these opportunities, leadership is able to align initiatives and other collective impact strategies to measurable key performance indicators (KPIs).
Responsibilities:-
●Conceptualized Audit and Control Framework for ETL application system for Enterprise Data Warehouse initiative at SSA on Informatica Power Center platform. Implemented the framework by designing and developing ETL components in Informatica, Oracle and shell scripts.
●Led ETL development as SME in Informatica Power Center and Power Exchange application development, providing expert solutions for ETL application development and data warehouse development for the EDW program, which also involved analysis of Informatica metadata, data profiling of the warehouse, data distribution in the MPP database, etc.
●Performed setting up and configuration of application services for EDW and DCPS application programs. Created maintenance tasks by developing domain backup, repository backup, file purging, activity logging, etc. scripts. Configured security by folder configuration, LDAP configuration, security domains, OS profiles, etc. Implemented SSL for Informatica Power Exchange for external bulk load utilities.
●Troubleshot rebranded ODBC drivers with Informatica, identifying parallel-performance and driver error issues.
●Implementing Data Security Policies in the Informatica Dynamic Data Masking server.
●Configuring DDM environments planned for enterprise wide databases classified by data applications.
●Developing and configuring DDM services, connection rules, and security rules for clustered DDM server nodes for Partial, Full and No Masking policies and user access classification.
●Developing Informatica DDM best practices for data security policy management, maintenance and deployment that requires migration, replication and synchronization of security rules, domains and services.
●Troubleshot QA defects in rule matchers and rule actions for data masking and rewrites of SQL requests containing simple and nested queries, including masking for multi-result-set stored procedure database requests.
●Improved performance by choosing the optimal combination of Informatica, UNIX shell scripts, and database SQL or procedure objects for each task.
●Improved performance of ETL processes by employing performance bottleneck resolutions to Informatica Mappings, configuring Transformations.
●Implemented business rules for ETL and ELT using combination of transformations in Informatica mappings, UNIX shell scripting and Oracle SQL PL/SQL Stored Procedure components for data transformation, data load / update strategies for data warehouse to assist high performance data mining activities.
●Used Power Exchange Change Data Capture and bulk extraction for Oracle and mainframe, Informatica Data Quality (IDQ).
●Participated in ETL system audits with Informatica to ensure feasible standards and practices.
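The Partial / Full / No Masking policies mentioned above can be illustrated conceptually in shell. Informatica Dynamic Data Masking applies such rules in-flight to SQL result sets at the DDM server; the sed sketch below only demonstrates the policy idea on a flat extract, with a hypothetical `id|ssn|email|state` layout:

```shell
#!/bin/sh
# Conceptual illustration of masking policies (NOT the DDM product):
#   Partial masking - keep the last 4 digits of the SSN.
#   Full masking    - replace the email entirely.
#   No masking      - id and state pass through untouched.
# Record layout (hypothetical): id|ssn|email|state

printf 'M001|123-45-6789|jane@example.com|KY\n' > extract.txt

sed -E -e 's/[0-9]{3}-[0-9]{2}-([0-9]{4})/XXX-XX-\1/' \
       -e 's/[^|]+@[^|]+/****@****/' \
    extract.txt > masked.txt

cat masked.txt
```

Running this yields `M001|XXX-XX-6789|****@****|KY`, showing how different columns fall under different masking classifications.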
Environment: Informatica Power Center 9.6/9.5/9.1, Oracle 12c/11g, Toad/SQL Developer, SQL scripting/tuning, PL/SQL, XML, Solaris 10, Unix Shell Scripting
Humana Inc., Louisville, KY July 2014 – Dec 2016
HEDIS
Oracle SQL Lead/Informatica Lead Developer
The objective of the project was to design and develop Healthcare Effectiveness and Data Information Set Reports to provide data analysis for Members, Enrollment, Claims, Pharmacy and Product information. The ETL Process involved extraction and migration of the data from SQL server and Flat files, implementing the business logic and populating the data into the target DataMart (Oracle).
Responsibilities:-
●Extracted data from various heterogeneous sources like Oracle, SQL Server, Flat Files
●Extensively used Informatica Client tools - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
●Developed complex mapping using Informatica Power Center tool.
●Extracting data from Oracle and Flat file, Excel files and performed complex joiner, Expression, Aggregate, Lookup, Stored procedure, Filter, Router transformations and Update strategy transformations to load data into the target systems.
●Created Sessions, Tasks, Workflows and worklets using Workflow manager.
●Worked with Data modeler in developing STAR Schemas
●Involved in performance tuning and query optimization.
●Used TOAD, SQL Developer to develop and debug procedures and packages.
●Involved in developing the Deployment groups for deploying the code between various environments (Dev, QA)
●Created pre-SQL and post-SQL scripts to be run at the Informatica session level.
●Worked extensively with session parameters, Mapping Parameters, Mapping Variables and Parameter files for Incremental Loading
●Used Debugger to fix the defects/ errors and data issues.
●Expertise in using both connected and unconnected Lookup Transformations.
●Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
●Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD
●Monitored and improved query performance by creating views, indexes, hints, and subqueries.
●Extensively involved in enhancing and managing Unix Shell Scripts.
●Developed workflow dependency in Informatica using Event Wait Task, Command Wait.
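The parameter-file-driven incremental loading described above can be sketched as a shell wrapper that writes an Informatica parameter file before the workflow runs. The folder/workflow/session names, the `$$LAST_EXTRACT_DATE` mapping variable, and the watermark file are hypothetical examples (the section-header syntax follows the standard PowerCenter parameter file format):

```shell
#!/bin/sh
# Illustrative sketch: generate a PowerCenter parameter file whose
# $$LAST_EXTRACT_DATE drives the incremental extract. Names are hypothetical.

WATERMARK_FILE="last_extract.dat"
PARAM_FILE="wf_member_load.param"

# Fall back to an initial-load date when no previous-run watermark exists.
last_run=$(cat "$WATERMARK_FILE" 2>/dev/null || echo "1900-01-01 00:00:00")

cat > "$PARAM_FILE" <<EOF
[HEDIS.WF:wf_member_load.ST:s_m_member_incr]
\$\$LAST_EXTRACT_DATE=$last_run
\$DBConnection_SRC=ORA_SRC_DEV
EOF

# After a successful workflow run, advance the watermark for the next cycle.
date -u '+%Y-%m-%d %H:%M:%S' > "$WATERMARK_FILE"

cat "$PARAM_FILE"
```

Each run thus extracts only rows changed since the previous run, and pointing `$DBConnection_SRC` at a different relational connection is how the same workflow moves between environments.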
Environment: Informatica Power Center 9.1.0, Oracle 11g, SQL Server 2008, MS Access 2010, SQL*Loader, UNIX, Putty, SQL
HUMANA Jan 2013 - June 2014
EDW
Oracle SQL/Informatica ETL Developer
Developed and maintained ETL mappings to extract, transform, and load data from various data sources into the Enterprise Data Warehouse. The project supports decision-making for the business users.
Responsibilities:-
●Involved in dimensional modeling of the data warehouse and in designing the business processes, dimensions, and measured facts.
●Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
●Developed number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and to load the data incrementally.
●Worked on performance tuning of SQL and mappings by usage of SQL Overrides in Lookups, Source filter in Source Qualifier and data flow management into multiple targets using Router transformations.
●Implemented Slowly Changing Dimensions Type 1 and Type 2 to manage change data capture, per the requirements of the business users.
●Worked with Persistent Caches for Conformed Dimensions for the better performance and faster data load to the data warehouse.
●Used Power Center Workflow manager for session management, database connection management and scheduling of jobs to be run.
●Developed Static and Dynamic Parameter Files for reusability and database connection management among Development/Testing/Production environments.
Environment: Informatica Power Center 9.1.0, Oracle 10g, SQL Server 2005, SQL*Loader, SQL, TOAD, UNIX and MS Office.
Humana, Louisville, KY August 2012 – Dec 2012
PPI-Physicians performance Interactive
Informatica Developer
Humana Inc. is a Fortune 500 managed health care company that markets and administers health insurance, dealing mainly with health insurance and claims data. The system contains terabytes of data used to generate reports and spans wide insurance-related warehouses. PPI generated files for a third-party vendor that assigns NCQA standards ratings to the firm based on the data provided.
Responsibilities:-
● Extracted data from various heterogeneous sources like Oracle, SQL Server, MS Access and Flat files.
●Experience on working with complete Software Development Life Cycle of the application.
●Involved in monitoring and maintenance of the Unix Server performance.
●Involved in creating database tables, views, triggers.
●Created many SQL scripts and executed them through Unix Shell scripts.
●Worked on Designer tools like Source analyzer, Warehouse designer, Transformation developer, Mapplet designer and Mapping designer.
●Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
●Worked with Session logs and Work flow logs for Error handling and troubleshooting in Dev Environment.
●Optimized SQL queries for better performance.
●Created pre-SQL and post-SQL scripts to be run at the Informatica session level.
●Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
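The daily/weekly/quarterly load scheduling mentioned above can be sketched as a wrapper that classifies the run date before kicking off the appropriate workflow. The classification rules here (quarterly on the first day of a quarter, weekly on Sundays, daily otherwise) are hypothetical examples, not the project's actual calendar:

```shell
#!/bin/sh
# Illustrative sketch: pick the load type from the run date.
# Rules are hypothetical: quarter start -> QUARTERLY, Sunday -> WEEKLY,
# anything else -> DAILY.

load_type() {
    month="$1"; day="$2"; dow="$3"   # numeric month, day-of-month, ISO day-of-week
    if [ "$day" -eq 1 ] && { [ "$month" -eq 1 ] || [ "$month" -eq 4 ] || \
                             [ "$month" -eq 7 ] || [ "$month" -eq 10 ]; }; then
        echo QUARTERLY
    elif [ "$dow" -eq 7 ]; then
        echo WEEKLY
    else
        echo DAILY
    fi
}

# In production the arguments would come from: date '+%m %d %u'
load_type 4 1 1     # first day of Q2
load_type 6 15 7    # a Sunday
load_type 6 16 1    # an ordinary weekday
```

The wrapper's result would then select which workflow (and which incremental window) the scheduler launches.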
Environment: Informatica Power Center 9.1.0, Oracle 10g, PL/SQL Developer, DataFlux, BPM (Business Process Management), Windows XP, DB2, UNIX
Humana, Anvita, KY Dec 2011 – July 2012
Informatica ETL Developer
The project involved the development and implementation of data transfer from relational databases and flat files to the Oracle database in the Business Intelligence environment.
Responsibilities:-
●Designed, developed and documented multiple interfaces using the Informatica Power Center.
●Developed many complex mappings and mapplets using various transformations (Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expressions, and Aggregator) for loading the data into Data Warehouse.
●Involved in designing the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Business Units information in tables.
●Created various Reusable and Non-Reusable tasks like Session and other tasks like Decision and email tasks.
●Performance monitoring and tuning.
●Responsible for monitoring sessions that are running, scheduled, completed and failed.
●Used Informatica Server Manager to create, schedule, monitor sessions and send pre and post session emails to communicate success or failure of session execution.
Environment: Informatica 9.1.0, Oracle 9i, MS Access, Windows NT
SI - State Immunization Jan 2011 to Dec 2011
Oracle SQL Developer (Noida offshore)
This system provides vaccination details of individuals under 17 years of age, allowing the US government to keep track of all children's vaccination records. After all ETL operations, we uploaded the files to their registry site.
Responsibilities:-
●Analyzed business requirements and worked closely with the various application teams and business teams to understand business requirements and the source data.
●Involved in designing logical models for staging, production warehouses.
●Designed and developed several ETL scripts using Informatica, UNIX shell scripts.
●Extensively used all the features of Informatica including Designer, Workflow manager and Repository Manager, Workflow monitor.
●Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
●Worked with mappings using expressions, aggregators, filters, lookup, update strategy and stored procedures transformations.
●Partitioned sources to improve session performance.
●Created flexible mappings/sessions using parameters, variables and heavily using parameter files.
●Improved session run times by partitioning the sessions; also heavily involved in database fine-tuning (creating indexes, stored procedures, etc.) and partitioning Oracle databases.
●Derived various complex calculations as per the report specification.
Environment: Informatica Power Center 9.1.0, Oracle 9i, Shell Scripting, SQL, UNIX, Windows 2000, XP, and Windows 7.