Data Manager

Location:
Birmingham, AL, 35244
Posted:
February 03, 2021

Venki Kallem

adjwxz@r.postjobfree.com

205-***-****

PROFESSIONAL SUMMARY:

10+ years of experience in data warehousing using Informatica as an ETL tool, with exposure to domains such as Banking, Healthcare, Insurance, Retail and Finance.

Extensive ETL testing experience using Informatica (PowerCenter/PowerMart) tools: Designer, Workflow Manager and Workflow Monitor.

Conversant with all phases of the Software Development Life Cycle (SDLC), including system analysis, design, development and implementation, as well as Relational Database Management System (RDBMS) concepts. Wide range of experience in software development, project management, data integration, Master Data Management and quality assurance. Performed data extraction, transformation and loading using Informatica PowerCenter 10.x/9.6.1/9.5.1/9.1.1/8.x.

Expertise in the healthcare domain, including Medicare, Medicaid and insurance compliance under Health Insurance Portability and Accountability Act (HIPAA) regulations and requirements.

Worked on HIPAA standard formats such as 837, 835, 277 and 999 Acknowledgements.

Worked on Repository Manager, Workflow Manager, Workflow Monitor, the Admin Console and the Designer tools (Mapping Designer, Transformation Developer, Mapplet Designer, Source Analyzer and Target Designer). Designed complex mappings using Source Qualifier, Data Masking, Update Strategy, Expression, Sequence Generator, Aggregator, Filter, Joiner, Java, Router, Connected and Unconnected Lookup, Normalizer, Rank, XML, Sorter, HTTP, Transaction Control and Union transformations to load data into different target types from sources such as flat files (delimited and fixed-width text or CSV files), XML, Oracle, IBM mainframe COBOL files, SQL Server and DB2.

Experience designing, developing, testing, reviewing and optimizing Informatica MDM implementations.

Developed Python scripts to handle incremental loads (see the sketch below).
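
A minimal sketch of a timestamp-based incremental load; table, column and file names are hypothetical, and sqlite3 stands in here for the production databases:

    # incremental_load.py - hedged illustration of a timestamp-based incremental load.
    # Table/column names are placeholders; real jobs ran against Oracle/SQL Server.
    import sqlite3

    def incremental_load(conn, last_run_ts):
        """Pull only rows changed since the previous successful run and upsert them."""
        cur = conn.cursor()
        cur.execute(
            "SELECT id, name, updated_at FROM src_orders WHERE updated_at > ?",
            (last_run_ts,),
        )
        rows = cur.fetchall()
        # Upsert into the staging table keyed on the natural key.
        cur.executemany(
            "INSERT OR REPLACE INTO stg_orders (id, name, updated_at) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()
        return len(rows)

    if __name__ == "__main__":
        conn = sqlite3.connect("warehouse.db")
        print(incremental_load(conn, "2021-01-01 00:00:00"))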

3+ years of experience in Cloud platform (AWS).

Experience in AWS (Amazon Web Services), including S3 buckets and Redshift (the AWS cloud data warehouse).
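
A hedged sketch of the S3-to-Redshift load pattern; the bucket, cluster, table and IAM role names below are placeholders rather than actual client resources, and boto3/psycopg2 are assumed to be available:

    # s3_to_redshift.py - illustrative only: stage a file in S3, then COPY it into Redshift.
    import boto3
    import psycopg2

    s3 = boto3.client("s3")
    # Stage the daily extract file in the landing prefix.
    s3.upload_file("daily_extract.csv", "my-etl-bucket", "landing/daily_extract.csv")

    # Issue a Redshift COPY so the cluster loads the file in parallel from S3.
    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                            port=5439, dbname="dw", user="etl_user", password="***")
    with conn, conn.cursor() as cur:
        cur.execute("""
            COPY staging.daily_orders
            FROM 's3://my-etl-bucket/landing/daily_extract.csv'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
            FORMAT AS CSV IGNOREHEADER 1;
        """)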

Worked with networking teams to configure AWS Direct Connect, establishing dedicated connections between data centers and the AWS Cloud.

Designed and configured the AWS Simple Notification Service (SNS) and Simple Email Service (SES) architecture of the solution in collaboration with the client.

Wrote Python scripts for extracting data from JSON and XML files, along the lines of the sketch below.
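
A small illustration of that kind of extraction using only the Python standard library; file names and element/field names are made up:

    # parse_feeds.py - pull fields out of JSON and XML source files.
    import json
    import xml.etree.ElementTree as ET

    def read_json(path):
        with open(path) as fh:
            payload = json.load(fh)
        # Keep only the fields the downstream load needs.
        return [(rec["id"], rec["amount"]) for rec in payload["records"]]

    def read_xml(path):
        # Expects <records><record id="..."><amount>...</amount></record>...</records>
        root = ET.parse(path).getroot()
        return [(node.get("id"), node.findtext("amount")) for node in root.findall("record")]

    if __name__ == "__main__":
        print(read_json("claims.json"))
        print(read_xml("claims.xml"))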

Managed Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3, data marts, OLAP, OLTP and Change Data Capture (CDC).

Used pre-SQL and post-SQL at the session level. Performed data cleansing, data staging, data profiling and error handling; analyzed session and workflow log files; and applied performance optimizations such as Pushdown Optimization (PDO) and session partitioning to troubleshoot bottlenecks at the source, target, mapping, session and system levels.

Worked in Agile methodology.

Worked on different databases like Oracle, DB2, Teradata, SQL Server, MS Access.

Data warehousing and data modeling using Erwin, covering staging tables, stored procedures, functions, cursors, dimension and fact tables, surrogate keys, primary and foreign keys, star and snowflake schemas, triggers and normalization/denormalization. Performance tuning such as creating indexes at the database level.

Experience in Scheduling Tools like Autosys, Control-M, and Workload manager.

Worked with version control tools such as Bitbucket, SVN and PC-based version control.

Knowledge of Teradata utilities such as MultiLoad, FastLoad, TPump, FastExport and BTEQ scripts.

Experience in using Oracle Development tools such as Tool for Oracle Application Development (TOAD).

Knowledge of Oracle utilities such as SQL*Loader.

Wrote simple and complex SQL queries using subqueries and multi-table joins (left, right and inner joins); a representative example appears below.
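
A representative query of that shape, run here through Python's sqlite3 so the example is self-contained; the customer/order/region tables are hypothetical:

    # join_example.py - a LEFT JOIN onto a subquery plus an INNER JOIN.
    import sqlite3

    QUERY = """
    SELECT c.customer_id,
           c.name,
           o.total_orders
    FROM customers AS c
    LEFT JOIN (
        SELECT customer_id, COUNT(*) AS total_orders
        FROM orders
        GROUP BY customer_id
    ) AS o ON o.customer_id = c.customer_id
    INNER JOIN regions AS r ON r.region_id = c.region_id
    WHERE r.name = 'Sunbelt';
    """

    conn = sqlite3.connect("warehouse.db")
    for row in conn.execute(QUERY):
        print(row)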

Experience in UNIX Shell Scripting and batch scripting for parsing files.

Experience with R Server to execute R programs.

Experience using the Informatica command-line utilities pmcmd and pmrep to execute workflows (see the sketch below).
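
A hedged sketch of kicking off a workflow with pmcmd from Python; the Integration Service, domain, folder and workflow names are placeholders:

    # run_workflow.py - wrap the pmcmd startworkflow call; values are illustrative.
    import subprocess

    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_DEV",        # Integration Service name
        "-d", "Domain_Dev",     # Informatica domain
        "-u", "etl_user",
        "-p", "********",
        "-f", "FOLDER_SALES",   # repository folder
        "-wait",                # block until the workflow completes
        "wf_load_sales",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        raise SystemExit("pmcmd failed: " + result.stderr)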

Used IDQ (Informatica Data Quality) as a data cleansing tool to create IDQ mapplets and correct data before loading it into the target tables.

Created profiles on the source tables to find data anomalies using IDQ.

Experience in ETL technical documentation.

TECHNICAL SKILLS:

ETL: Informatica PowerCenter 10.x/9.x/8.x, Informatica Developer 10.x/9.6, Informatica Analyst, Informatica Cloud

Databases: Oracle 11g/10g, SQL Server 2014/2012/2008 R2, DB2/UDB, MS Access, Teradata V13/V2R12, Amazon Web Services (AWS), Salesforce

Methodologies: Star schema and Snowflake schema

Operating Systems: UNIX, Linux, Windows Server; UNIX shell scripting

Tools: PuTTY, WinSCP, Toad, AQT, Autosys, Control-M, Workload Manager, ClearCase, SVN, Bitbucket, Git, Erwin, Quality Control, JIRA

Web Design: PHP, HTML, Python

Utilities: SQL*Loader, MultiLoad, FastLoad, FastExport, TPump, BTEQ, pmcmd, pmrep, SQL Assistant

PROFESSIONAL EXPERIENCE:

BBVA Compass, Birmingham AL (Feb 2020 – Present)

Sr. ETL Informatica Developer

Customer Profitability

BBVA Compass is a leading U.S. banking franchise with operations throughout the Sunbelt region and ranks among the 25 largest banks in the U.S. It has been recognized as one of the nation's leading Small Business Administration (SBA) lenders. This project was developed to maintain a catalogue of all products offered online to customers. It involved extracting data from multiple source systems and loading it through an ETL process into staging tables and then into target tables.

Responsibilities:

Conducted technical design presentations for the client and obtained sign-off.

Designed and developed ETL Mappings, Mapplets, Workflows, Worklets using Informatica Power center 10.x, 9.x.

Developed test cases based on test matrix including test data preparation for Data Completeness, Data Transformations, Data quality, Performance and scalability.

Developed Test Cases and SQL Queries to perform various Validations.

Designed and built integrations supporting standard data warehousing objects (Type 2 dimensions, aggregations, star schema, etc.).

Developed Python scripts to separate the various record types provided by the users.

Developed a Python script to initiate a web service call, extract the operational data in XML form and load it into the SQL tables; a condensed version is sketched below.
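
A condensed, hedged version of that flow; the endpoint URL, XML element names and staging table are placeholders, and requests/pyodbc are assumed to be installed:

    # ws_xml_to_sql.py - call a web service, parse the XML payload, load rows into SQL.
    import requests
    import xml.etree.ElementTree as ET
    import pyodbc

    resp = requests.get("https://example.internal/api/operational-data", timeout=60)
    resp.raise_for_status()

    root = ET.fromstring(resp.text)
    rows = [(r.findtext("account_id"), r.findtext("balance"), r.findtext("as_of_date"))
            for r in root.iter("record")]

    conn = pyodbc.connect("DSN=EDW_STAGE;UID=etl_user;PWD=***")
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO stg.operational_data (account_id, balance, as_of_date) VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()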

Created Design Documents for source to target mappings. Developed mappings to send files daily to AWS.

Worked in an Agile methodology with two-week sprints.

Designed, developed and implemented Control-M jobs to trigger UNIX shell scripts that import data from the source systems and bring it into HDFS through AWS S3 storage.

Designed mappings using transformations such as Data Masking, Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank and Aggregator in the Designer.

Worked on data cleansing using the cleanse functions in Informatica MDM.

Analyzed different data sources such as SQL Server, Oracle and XML files, understood the relationships by analyzing the OLTP sources, and loaded the data into Oracle.

Hosted and led Scrum meetings and followed up with the respective business owners to effectively deliver the data.

Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of data (see the Type 2 sketch below).
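
A minimal sketch of SCD Type 2 logic using pandas; the dimension and column names are hypothetical, and handling of brand-new customers is omitted for brevity:

    # scd2_example.py - expire changed rows and append new versions to keep history.
    import pandas as pd

    def apply_scd2(dim, incoming, run_date):
        """dim: customer_id, address, start_date, end_date, current_flag; incoming: customer_id, address."""
        merged = dim[dim["current_flag"] == "Y"].merge(
            incoming, on="customer_id", suffixes=("_old", "_new")
        )
        changed_ids = merged.loc[merged["address_old"] != merged["address_new"], "customer_id"]

        # Close out the old versions of changed customers.
        expire = dim["customer_id"].isin(changed_ids) & (dim["current_flag"] == "Y")
        dim.loc[expire, ["end_date", "current_flag"]] = [run_date, "N"]

        # Append the new versions as the current rows.
        new_rows = incoming[incoming["customer_id"].isin(changed_ids)].assign(
            start_date=run_date, end_date=None, current_flag="Y"
        )
        return pd.concat([dim, new_rows], ignore_index=True)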

Developed SQL sub queries, joins between multiple tables, Procedures and functions.

Created Triggers, Views and Synonyms to support Front end users.

Designed the error handling process in ETL.

Created shell scripts to monitor system space and used the cron utility to schedule them (illustrated below).
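
An illustrative Python stand-in for that monitoring script (the production version was a shell script; the threshold, mount point and addresses are placeholders), with a sample cron entry in the comment:

    # check_space.py - alert when the data volume crosses a usage threshold.
    # Example cron entry (hourly):  0 * * * * /usr/bin/python3 /opt/etl/check_space.py
    import shutil
    import smtplib
    from email.message import EmailMessage

    usage = shutil.disk_usage("/data")
    pct_used = usage.used / usage.total * 100

    if pct_used > 85:
        msg = EmailMessage()
        msg["Subject"] = "Disk usage alert: {:.1f}% used on /data".format(pct_used)
        msg["From"] = "etl-alerts@example.com"
        msg["To"] = "oncall@example.com"
        msg.set_content("Clean up session/workflow logs or extend the volume.")
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)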

Worked on XML sources, the XML Parser and HTTP transformations within Informatica.

Worked from technical specifications to design and develop the ETL mappings.

Demonstrated technical concepts to senior level business stakeholders.

Provided end to end technical guidance on the software development life cycle (requirements through implementation).

Resolved issues through data analysis as per the business requirements.

Discussed data architecture and data lineage methodologies in detail.

Performed data modeling and understood the key business elements.

Created batch scripts and PowerShell scripts.

Dealt with multiple deadline-driven, customer-sensitive projects and tasks.

Used SVN (TortoiseSVN) and Bitbucket to maintain ETL and PL/SQL code changes.

Used Autosys to schedule ETL mappings and PL/SQL jobs on daily and hourly schedules.

Worked on UNIX to clean up session log files, create parameter files and prepare shell scripts.

Worked closely with the reporting team and helped them get the data needed for their reports.

Provided L3 support to rectify production issues based on severity.

Worked on Agile Methodology.

Environment: Informatica PowerCenter 10.x, Informatica MDM 10.1, Oracle 12c, SQL Server 2014, Autosys, AWS, Redshift, Quality Control, JIRA, Wiki pages, SharePoint, Bitbucket, Git.

Humana, Louisville KY (July 2019– Feb 2020)

Sr. ETL Developer/Informatica Cloud

Dental Network

Humana Inc. is a for-profit American health insurance company based in Louisville, Kentucky. As of 2014 Humana had over 13 million customers in the U.S., reported a 2013 revenue of US$41.3 billion, and had 41,600 employees.

Responsibilities:

Loaded and extracted data from multiple cloud applications (primarily Salesforce.com) and legacy applications (e.g., Oracle).

Provided recommendations and expertise on data integration and ETL methodologies.

Responsible for analyzing UltiPro interfaces and creating design and technical specifications based on functional requirements and analysis.

Worked on various HIPAA terms and functionalities.

Worked on HIPAA standard formats like 837, 835, 277 and 999 Acknowledgement.

Performed design, development, testing and implementation of ETL processes using Informatica Cloud.

Converted extraction logic from database technologies such as Oracle, SQL Server and DB2.

Performed data cleansing prior to loading data into the target system.

Converted specifications into programs and data mappings in an Informatica Cloud ETL environment.

Designed and developed ETL Mappings, Mapplets, Workflows, Worklets using Informatica Power center 10.x, 9.x.

Designed and built integrations supporting standard data warehousing objects (Type 2 dimensions, aggregations, star schema, etc.).

Created Design Documents for source to target mappings. Developed mappings to send files daily to AWS.

Worked in an Agile methodology with two-week sprints.

Developed Python scripts to parameterize session parameters during decomposition, along the lines of the parameter-file sketch below.
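
A hedged sketch of generating a PowerCenter parameter file per run; the folder, workflow, session and parameter names are placeholders:

    # build_param_file.py - write a parameter file so session parameters change per run.
    from datetime import date

    def write_param_file(path, run_date, src_conn):
        lines = [
            "[FOLDER_SALES.WF:wf_load_sales.ST:s_m_load_sales]",
            "$$RUN_DATE={:%Y-%m-%d}".format(run_date),
            "$DBConnection_SRC=" + src_conn,
            "$$REC_TYPE=DAILY",
        ]
        with open(path, "w") as fh:
            fh.write("\n".join(lines) + "\n")

    if __name__ == "__main__":
        write_param_file("/opt/etl/params/wf_load_sales.par", date.today(), "Ora_SRC_DEV")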

Designed, developed and implemented Control-M jobs to trigger UNIX shell scripts that import data from the source systems and bring it into HDFS through AWS S3 storage.

Designed mappings using transformations such as Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank and Aggregator in the Designer.

Environment: Informatica Cloud / Informatica PowerCenter 10.x, Oracle 12c, SQL Server 2014, Autosys, AWS, Redshift, Quality Control, JIRA, Wiki pages, SharePoint, Bitbucket, Git.

BBVA Compass, Birmingham AL (Jan 2018– June 2019)

Sr. ETL Informatica Developer

FRAUD Data Mart

BBVA Compass is a leading U.S. banking franchise with operations throughout the Sunbelt region and ranks among the 25 largest banks in the U.S. It has been recognized as one of the nation's leading Small Business Administration (SBA) lenders. This project was developed to maintain a catalogue of all products offered online to customers. It involved extracting data from multiple source systems and loading it through an ETL process into staging tables and then into target tables.

Responsibilities:

Conducted technical design presentations for the client and obtained sign-off.

Designed and developed ETL Mappings, Mapplets, Workflows, Worklets using Informatica Power center 10.x, 9.x.

Developed test cases based on test matrix including test data preparation for Data Completeness, Data Transformations, Data quality, Performance and scalability.

Developed Test Cases and SQL Queries to perform various Validations.

Developed Python scripts to parameterize session parameters during decomposition.

Designed and built integrations supporting standard data warehousing objects (Type 2 dimensions, aggregations, star schema, etc.).

Created Design Documents for source to target mappings. Developed mappings to send files daily to AWS.

Worked in an Agile methodology with two-week sprints.

Designed, developed and implemented Control-M jobs to trigger UNIX shell scripts that import data from the source systems and bring it into HDFS through AWS S3 storage.

Designed mappings using transformations such as Lookup (connected, unconnected, dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank and Aggregator in the Designer.

Analyzed different data sources such as SQL Server, Oracle and XML files, understood the relationships by analyzing the OLTP sources, and loaded the data into Oracle.

Hosted and led Scrum meetings and followed up with the respective business owners to effectively deliver the data.

Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 to maintain the full history of data.

Developed SQL sub queries, joins between multiple tables, Procedures and functions.

Created Triggers, Views and Synonyms to support Front end users.

Designed the error handling process in ETL.

Created shell scripts to monitor system space and used the cron utility to schedule them.

Worked on XML sources, the XML Parser and HTTP transformations within Informatica.

Worked from technical specifications to design and develop the ETL mappings.

Demonstrated technical concepts to senior level business stakeholders.

Provided end to end technical guidance on the software development life cycle (requirements through implementation).

Resolved issues through data analysis as per the business requirements.

Discussed data architecture and data lineage methodologies in detail.

Performed data modeling and understood the key business elements.

Created batch scripts and PowerShell scripts.

Dealt with multiple deadline-driven, customer-sensitive projects and tasks.

Used SVN (TortoiseSVN) and Bitbucket to maintain ETL and PL/SQL code changes.

Used Autosys to schedule ETL mappings and PL/SQL jobs on daily and hourly schedules.

Worked on UNIX to clean up session log files, create parameter files and prepare shell scripts.

Worked closely with the reporting team and helped them get the data needed for their reports.

Provided L3 support to rectify production issues based on severity.

Worked on Agile Methodology.

Environment: Informatica PowerCenter 10.x, Oracle 12c, SQL Server 2014, Autosys, AWS, Redshift, Quality Control, JIRA, Wiki pages, SharePoint, Bitbucket, Git.

BBVA Compass Bank, Birmingham AL (Aug 2015– Dec 2017)

ETL/Informatica Developer

SLMS and SBO2000

BBVA Compass is a leading U.S. banking franchise with operations throughout the Sunbelt region and ranks among the 25 largest banks in the U.S. It has been recognized as one of the nation's leading Small Business Administration (SBA) lenders.

Responsibilities:

Understood the business requirements and converted them into appropriate technical requirement documents.

Designed mappings using transformations such as Lookup, Expression, Joiner and Router using Informatica PowerCenter and Informatica Developer.

Developed UNIX shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.

Worked in an Agile methodology with two-week sprints.

Analyzed different data sources, including SQL Server, flat files (delimited and fixed-width text files) and XML files carrying the contract and billing data, understood the relationships by analyzing the OLTP sources, and loaded the data into the Oracle database.

Created and configured workflows, Worklets and sessions using Informatica Workflow Manager.

Designed the error handling process in ETL.

Created reusable tasks at workflow level.

Created reusable transformations at mapping level.

Prepared ETL technical documents and data quality rule specification documents.

Based on the ETL loads, created PDOs, profiles and scorecards to check the data quality standards.

Environment: Informatica PowerCenter 9.6.1, Informatica Developer 10.1, Informatica Analyst, Oracle 11g, SQL Server 2012/2014, SAS Developer 5.1, version control, JIRA.

AUX SYSTEMS INC, Newburgh, NY (Feb 2015 – July 2015)

Informatica Developer/Data Masking

Responsibilities:

Analyzed business process and gathered core business requirements. Interacted with business analysts and end users.

Prepared a handbook of standards for Informatica code development.

Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.

Developed Custom metadata repository.

Designed mappings and mapplets to extract data from SQL Server, Sybase and Oracle sources.

Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.

Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level.

Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.

Used the Debugger to test the data flow and fix the mappings.

Created and monitored the workflows and tasks using Workflow Manager.

Partitioned Sessions for concurrent loading of data into the target tables.

Tuned the workflows and mappings.

Identified bottlenecks and applied performance tuning to resolve them.

Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.

Executed Workflows and Sessions using Workflow Monitor.

Dealt with data issues in the staging flat files; once the data was cleaned up, it was sent to the targets.

Actively coordinated with QA team in the testing phase and helped the team to understand the dependency chain of the whole project.

Executed workflows using the pmcmd command in UNIX.

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, Netezza, Teradata V13, Teradata utilities, Windows Server 2012.

Prime Ki Solutions, Hyderabad, Ind (June 2010 – Nov 2013)

ETL/Informatica Developer

Responsibilities:

Drafted Software Requirement Specifications for the project.

Interacted with business representatives for requirement analysis and to define business and functional specifications.

Coordinated with Informatica Admin to setup the environment and to move objects to different repositories and folders.

Worked on tools such as Source Analyzer, Target designer, Mapping Designer, Workflow Manager, Mapplet Designer and Transformation Developer.

Designed and created mappings, sessions and workflows and scheduled the workflows as per the Functional & Technical specifications.

Extracted data from Oracle 10g, Netezza, XML, flat files and DB2, and loaded it into the Oracle warehouse.

Worked on batch scripting.

Designed mappings using different transformations such as Lookups, Filter, Expression, Update strategy, Joiner, Source Qualifier and Router for populating target tables as per business requirements.

Worked with Change Data Capture (CDC).

Worked with Shortcuts across Shared and Non-Shared Folders.

Created and Executed workflows and Worklets using Workflow Manager to load data into the Target Database.

Worked on scheduling tool and versioning tool.

Worked on Oracle utilities.

Analyzed Session log files to resolve errors in mapping and managed session configuration.

Created, configured, scheduled and monitored sessions and workflows to run on demand or on a schedule using Workflow Manager.

Performance tuned at various levels, including target, source, mapping and session, for large data files.

Used mapping variables and parameters for reusability of the code.

Maintained naming standards and warehouse standards for future application development and also created functional and technical specification documents.

Designed snowflake schemas, star schemas, dimension tables and fact tables.

Worked with surrogate keys, primary keys, foreign keys, triggers and normalization/denormalization.

Worked on Agile methodology.

Environment: Informatica Power Center 8.6.0, Oracle10g, DB2, PL/SQL, SQL loader, Toad, PL/SQL Developer tool, workload manager, ClearCase, Windows server 2008, AQT, batch scripts, FTP.

EDUCATION: Master of Science in Computer Science in 2014 from Northwestern Polytechnic University

Bachelor of Technology in Computer Science in 2009 from JNTU Hyderabad.


