
Data Manager

Location: Peoria, IL

Manogna

ac5w3w@r.postjobfree.com

732-***-****

SUMMARY:

Around 8 years of IT experience in developing, testing, and validating Data Warehouse applications using ETL and Business Intelligence tools such as Informatica PowerCenter, PowerExchange, HP Quality Center, and ALM.

Experience in installation, configuration, and administration of Informatica PowerCenter 6.x/7.x/8.x/9.x client and server on Windows and UNIX.

Full Software Development Life Cycle (SDLC) experience, involved in requirements analysis, design, development, testing, and maintenance, with working experience in Agile, Scrum, and Waterfall environments.

Highly experienced in extraction, transformation, and loading of legacy data into the Data Warehouse using ETL tools.

Strong experience in Data Analysis, Data Migration, Data Cleansing, Data Transformation, Data Integration, Data Import, and Data Export using multiple ETL tools such as Informatica PowerCenter and DataStage.

Experience working with SDLC, RUP, Waterfall and Agile methodologies

File mapping from source to target. Vendors were on different platforms, and challenges had to be overcome to convert proprietary data to EDI and EDI to proprietary text files.

Involved in test planning and effort estimation. Responsible for test status reporting and documentation. Single point of responsibility for releases and defect fixes.

Excellent understanding of ETL, Dimensional Data Modeling techniques, Slowly Changing Dimensions (SCD) and Data Warehouse Concepts - Star and Snowflake schemas, Fact and Dimension tables, Surrogate keys, and Normalization/Denormalization.

Designing and executing unit test plans and gap analysis to ensure that business requirements and functional specifications are tested and fulfilled.

Responsible for the Financial workstream in R12.2.3 implementation, support, and enhancement.

Used various transformations such as Expression, Filter, Aggregator, Lookup, Router, Normalizer, and Sequence Generator to load consistent data into Oracle and Teradata databases.

Good knowledge of Normalization (1NF, 2NF, and 3NF) and De-Normalization techniques for optimum performance in XML, relational, and dimensional database environments.

Expertise in SQL/PLSQL programming, developing & executing Packages, Stored Procedures, Functions, Triggers, Table Partitioning, Materialized Views.

Good experience with Teradata SQL Assistant. Extracted data from Teradata and loaded it into various other target systems; also used Teradata as a target.

The interfaces were built using EDI 834 transactions for eligibility and 820 for payment files.

Extensive success in translating business requirements and user expectations into detailed specifications employing Unified Modeling Language (UML).

Adhered to strict compliance policies/regulations; configured Facets modules such as Membership, Benefit, and Plan.

Experience in performance tuning of sources, targets, mappings, transformations, and sessions; created CDC data maps using PowerExchange and achieved incremental loading into target tables.

Designed both 3NF data models for ODS/OLTP systems and dimensional data models using Star and Snowflake schemas.

Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.

Worked with PowerExchange sources and extracted non-relational data into Informatica via data maps.

Experience with Type 1, Type 2, and Type 3 Slowly Changing Dimensions.
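
For illustration, a minimal Type 2 SCD load can be sketched in plain SQL. The example below uses hypothetical CUSTOMER_DIM and CUSTOMER_STG objects and Oracle syntax; it is not actual project code, only the general pattern of expiring the current row and inserting a new version (in Informatica this is typically built with Lookup and Update Strategy transformations).

    -- Step 1: expire the current dimension row when a tracked attribute has changed.
    UPDATE customer_dim d
       SET d.current_flag = 'N',
           d.effective_end_date = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO customer_dim
          (customer_key, customer_id, address, status,
           effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');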

Experience with Teradata as the target for the data marts. Worked with BTEQ, FastLoad, and MultiLoad.

Experience in Integration of various data sources like Oracle, Teradata, SQL Server, DB2 and Flat Files in various formats like fixed width, CSV and excel.

Developed proprietary and HIPAA-compliant backend applications to parse and edit incoming X12 and proprietary transactions, with subsequent re-routing of responses to the appropriate requestor.

Some of the files were mapped to the HIPAA gateway and finally loaded into Facets.

Expertise in error handling and reprocessing of error records during extraction and loading of data into enterprise warehouse objects.

Hands-on experience creating, modifying, and implementing UNIX shell scripts for running Informatica workflows and for pre-processing and post-processing validations.

Good knowledge of scheduling jobs in Autosys/Control-M and defining clear interdependencies between jobs.

Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.

TOOLS

ETL Tools

Informatica 6.x/7.x/8.x/9.x, PowerExchange 8.6.1, DataStage 8.1/8.5

Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling using the Erwin tool.

Testing Tools

HP Quality Center, Silk Test, SCM tools etc.

RDBMS

Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, MySQL, DB2, MS Access

Database Tools

TOAD, SQL*PLUS, SQL Developer, SQL*Loader, Teradata SQL Assistant

Languages

SQL (2012 and 2014), PLSQL, Shell Scripting, C, C++, PERL, PYTHON.

Operating Systems

MS-DOS, Windows7/Vista/XP/2003/2000/NT, UNIX, AIX, Sun Solaris

Caterpillar, Peoria, IL April’17- Till Date

Sr. ETL Developer

Caterpillar is the world's leading manufacturer of construction and mining equipment, diesel and natural gas engines, industrial gas turbines and diesel-electric locomotives. The ETL project supports the portal application, which enables business users to make the right decisions.

Identified the sources and analyzed the source data.

Created and stored metadata in the repository using Informatica Repository Manager.

Cleansed the source data, extracted and transformed data with business rules, and built reusable mappings using Informatica PowerCenter Designer.

Involved in creating, editing, scheduling and deleting of sessions using workflow manager in Informatica

Worked with Cognos Developers during requirements gathering

Implemented various Data Transformations using slowly changing dimensions

Involved in Installations, configurations of Informatica, OBIEE and DAC across multiple environments in Linux.

Monitored workflows and collected performance data to maximize the session performance

Applied the rules and profiled source and target data by using IDQ.

Provided fixes for critical bugs and assisted with code moves to QA.

Extensively worked with SQL queries. Created Stored Procedures, packages, Triggers, Views using PL/SQL Programming.

Translated user inputs into ETL and Siebel Analytics design documents.

Wrote complex Hive queries to extract data from heterogeneous sources (Data Lake) and persist the data into HDFS.
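
As an illustration of that pattern, a Hive query can join data-lake tables and persist the result set to an HDFS directory. The table names and output path below are placeholders, not the actual project objects.

    -- HiveQL sketch: join two data-lake tables and write the result to HDFS.
    INSERT OVERWRITE DIRECTORY '/data/curated/claims_daily'
    SELECT c.claim_id,
           c.member_id,
           p.provider_name,
           c.claim_amount
      FROM datalake.claims c
      JOIN datalake.providers p
        ON c.provider_id = p.provider_id
     WHERE c.load_date = '2017-06-01';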

Proficient in the Integration of various data sources with multiple relational databases like Oracle 11g, MS SQL Server, DB2, VSAM files and flat files into the staging area, ODS and data mart.

Mainframe Experience

Used Teradata utilities FastLoad, MultiLoad, and TPump to load the data.

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.

Created Logical/Physical Data models in 3NF in the Warehouse area of Enterprise Data Warehouse.

Optimized the performance of the Informatica mappings. Configured session properties and target options for better performance.

Extensively used Siebel Analytics Administration Tool for customizing and modifying the physical, business and presentation layers of metadata repository.

Performed Unit testing and Integration testing on the mappings. Worked with QA Team to resolve QA issues

Worked on analyzing Hadoop cluster and different Big Data analytic tools including Pig, Hive, HBase and Sqoop.

Created the system to extract, transform, and load market data and check correctness of data loading, using UNIX Korn shell, Perl, and Oracle stored procedures.

Deployments of Informatica, DAC, OBIEE across DEV, CIT and PRD environments.

Implemented the error logging and monitoring of the tables and databases by creating the DML/DDL triggers in TSQL.
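
A hedged T-SQL sketch of that idea is shown below; the audit table and trigger names are hypothetical and only illustrate how a database-level DDL trigger can log schema changes.

    -- Hypothetical audit table for captured DDL events.
    CREATE TABLE dbo.DDL_AuditLog (
        EventTime  DATETIME DEFAULT GETDATE(),
        LoginName  SYSNAME,
        EventType  NVARCHAR(100),
        ObjectName NVARCHAR(256),
        TsqlText   NVARCHAR(MAX)
    );
    GO
    -- Database-level trigger that records table create/alter/drop statements.
    CREATE TRIGGER trg_LogDDL ON DATABASE
    FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
    AS
    BEGIN
        DECLARE @evt XML = EVENTDATA();
        INSERT INTO dbo.DDL_AuditLog (LoginName, EventType, ObjectName, TsqlText)
        VALUES (SUSER_SNAME(),
                @evt.value('(/EVENT_INSTANCE/EventType)[1]', 'NVARCHAR(100)'),
                @evt.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(256)'),
                @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)'));
    END;
    GO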

Validated complex mappings involving Filter, Router, Expression, Lookup, Update Strategy, Sequence generator, Joiner and Aggregator transformations.

Read business requirements, wrote ETL logic to extract SAS datasets, and built reports.

Experience with Mainframe Technologies COBOL 85, JCL, VSAM and DB2

Worked with various IDQ transformations like Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, Expression

Expertise in defining and documenting the business process flows (UML) like Use case diagrams, Activity diagrams, Sequence diagrams and Data Flow Diagrams.

Used debugger to test the mapping and fixed the bugs. Created and used Debugger sessions to debug sessions and created breakpoints for better analysis of mappings.

Created mapping variables and parameters and used them appropriately in mappings.

Extensively used all PowerCenter components, including Designer, Workflow Manager, Repository Manager, and Workflow Monitor.

Used Informatica to load the data to Teradata by making various connections to load and extract the data to and from Teradata efficiently.

Strong expertise in using ETL Tool Informatica Power Center 8.x/9.6.1 (Designer, Workflow Manager, Repository Manager), Informatica Cloud, Informatica MDM 10.1, Informatica Data Quality (IDQ) 9.6.1 and ETL concepts.

Built and maintained complex statistical SAS routines using macros.

Employed complex T-SQL features such as CTEs and pivots in stored procedures and functions to represent data per business requirements.
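
A small T-SQL sketch of the CTE-plus-PIVOT pattern; the Orders table and month columns are illustrative assumptions, not actual project objects.

    -- The CTE prepares detail rows; PIVOT turns months into columns.
    WITH monthly_sales AS (
        SELECT region,
               DATENAME(month, order_date) AS order_month,
               amount
          FROM dbo.Orders
    )
    SELECT region, [January], [February], [March]
      FROM monthly_sales
     PIVOT (SUM(amount) FOR order_month IN ([January], [February], [March])) AS p;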

Designed and developed several ETL scripts using Informatica, UNIX shell scripts

Developed banking management scripts in Python to support the Chase website in creating user profiles and transactions for withdrawals and deposits.

Did QA of ETL processes, migrated Informatica objects from development to QA and production using deployment groups.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for Bulk Loading.
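
The typical shape of those routines, sketched in Oracle SQL with a placeholder index name: the pre-session command takes the index offline so the bulk load skips index maintenance, and the post-session command rebuilds it.

    -- Pre-session SQL: disable the index before the bulk load.
    ALTER INDEX sales_fact_cust_ix UNUSABLE;
    ALTER SESSION SET skip_unusable_indexes = TRUE;

    -- Post-session SQL: rebuild the index after the load completes.
    ALTER INDEX sales_fact_cust_ix REBUILD NOLOGGING;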

Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.

Involved in migration of the mapping from IDQ to Power center

Involved in conducting and leading the team meetings and providing status report to project manager.

Environment: Informatica PowerCenter 9.6.1, IDQ 9.6.1, Oracle 12c, MySQL, Agile, JIRA, DB2, UML, MS SQL Server 2005, MS SQL Server 2016, Erwin, TOAD, Korn shell scripts, Perl, Python, UltraEdit, WinSQL, Autosys, Tidal Scheduler, Shell Scripting, PuTTY, WinSCP, Notepad++, EDI 834/820, Facets 5.0, Cognos 10.x.

Wells Fargo,

Role: Sr. ETL Developer Feb’16- March’17

Work Location: San Francisco, CA

Wells Fargo & Company is an American international banking and financial services holding company headquartered in San Francisco, California. It is the world's second largest bank by market capitalization and the third largest bank in the U.S. by assets.

Interacted with Business Analysts to understand the business requirements and was involved in analyzing requirements to refine transformations.

Provided technical guidance for re-engineering functions of Oracle warehouse operations.

Collaborated with architects to align the ETL design to the business case and the overall solution.

Developed UNIX shell scripts for master workflows.

Designed and developed new and enhanced functionality for existing applications.

Worked on analyzing Hadoop cluster and different Big Data analytic tools including Pig, Hive, HBase and Sqoop.

Created the ETL source to target mapping documents working with the business Analysts.

Designed, developed, and maintained Enterprise Data Architecture for enterprise data management including business intelligence systems, data governance, data quality, enterprise metadata tools, data modeling, data integration, operational data stores, data marts, data warehouses, and data standards.

Involved in migration of the mapping from IDQ to Power center

Responsible for the Financial workstream in R12.2.3 implementation, support, and enhancement.

Established the OBIEE 10g to 11g upgrade strategy and documented the step-by-step upgrade process.

Extracted data from Oracle, Excel and flat files into SAS Datasets.

Maintenance and support responsibilities related to Base SAS along with other troubleshooting tasks.

Worked on basic Siebel administration activities like compilation, srf and repository migration activities.

Performed source data analysis and captured metadata, reviewed results with business. Corrected data anomalies as per business recommendation.

Analyzed existing SAS scripts based off various platforms such as UNIX, Windows, and Mainframe.

Worked with Informatica power center 9.5 to extract the data from IBM Mainframes DB2 sources into Teradata.

Created design standards of the ETL code by applying SCD logics.

Implemented the error logging and monitoring of the tables and databases by creating the DML/DDL triggers in TSQL.

Extensively worked on eScript to manage customization of Siebel application.

Created Parameter files, mapplets, and worklets for reusability in the code.

Reviewed and maintained the ETL coding standards.

Maintained an understanding of XML, XSD, DOM/SAX parsing, XPath and XSLT

Worked on Persistent Cache options wherever required to reuse the cache for larger table lookups.

Employed complex T-SQL features such as CTEs and pivots in stored procedures and functions to represent data per business requirements.

Installed Hadoop, MapReduce, HDFS, and developed multiple MapReduce and spark jobs in PIG and Hive for data cleaning and pre-processing.

Created 3NF business-area data models with de-normalized physical implementations, and performed data and information requirements analysis using the Erwin tool.

Coordinated the OBIEE RPD and Web Catalog objects migration from Dev environment to QA environment and to the Production environment.

Integrated Informatica MDM and IDQ to profile data and finalize match rules for the MDM hub.

Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.

Extensively used transformations like router, lookup (connected and unconnected), update strategy, source qualifier, joiner, expression, stored procedures, aggregator and sequence generator transformation.

Created Oracle stored procedures for capturing the ETL run statistics for the daily delta loads.
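
A minimal PL/SQL sketch of such a procedure is shown below; the ETL_RUN_STATS table and parameter names are assumptions for illustration, not the actual warehouse objects.

    CREATE OR REPLACE PROCEDURE log_etl_run_stats (
        p_workflow_name IN VARCHAR2,
        p_rows_read     IN NUMBER,
        p_rows_loaded   IN NUMBER,
        p_status        IN VARCHAR2
    ) AS
    BEGIN
        -- Record one row of statistics per daily delta-load run.
        INSERT INTO etl_run_stats (workflow_name, run_date, rows_read, rows_loaded, status)
        VALUES (p_workflow_name, SYSDATE, p_rows_read, p_rows_loaded, p_status);
        COMMIT;
    END log_etl_run_stats;
    /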

Performance tuned the database stored procedures and changed the updates to deletes and inserts to reduce the costly operations on the database.

Imported and exported data into HDFS and Hive using Sqoop.

Used performance tuning at the session level, mapping level, and at the database level.

Used the ‘Organize On’ Option in the Netezza tables for frequently joined tables.
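
For reference, the 'Organize On' clause is specified at table creation time; the DDL below is an illustrative Netezza sketch with placeholder table and column names.

    -- Distribute rows on the join key and cluster data by the frequent join/filter columns.
    CREATE TABLE txn_fact (
        txn_id     BIGINT,
        account_id BIGINT,
        txn_date   DATE,
        amount     NUMERIC(18,2)
    )
    DISTRIBUTE ON (account_id)
    ORGANIZE ON (txn_date, account_id);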

Created DB objects using best practices to avoid data skew on the objects.

Performance tuning of SQL scripts.

Created and executed the unit test plans based on system and validation requirements.

Worked on the migration of the code from DEV, QA, UAT, PROD using XML migrations.

Effectively communicated project expectations to team members in a timely and clear fashion.

Environment: Informatica 9.1/9.6.1, Teradata, DB2, Oracle 12c, Agile, JIRA, IDQ, MySQL, Perl, Python, MS SQL Server, UML, Cognos, Erwin 9.5.02, Teradata SQL Assistant, TOAD 12.1.0, Agility Workbench 4.3, flat files, XML files.

Mayo Clinic Dec 14-Feb 16

Role: ETL Developer

Work Location: Rochester, MN

Mayo Clinic is a nonprofit medical practice and medical research group based in Rochester, Minnesota. It employs more than 4,500 physicians and scientists and 57,100 allied health staff. It is widely regarded as one of the United States' greatest hospitals and ranked No. 1 in the country on the 2016–2017 U.S. News & World Report List of "Best Hospitals" of the United States.

Responsibilities:

Extensively developed various Mappings that incorporated business logic, which performed Extraction, Transformation and Loading of Source data into OLAP schema.

Extensively used Transformations like Source Qualifier, Expression, Lookup, Update Strategy, Aggregator, Stored Procedure, Filter, Router, Joiner etc. Implemented Lookup transformation to update already existing target tables.

Extensively used Re-Usable Objects like Mapplets, sessions and Transformations in the Mappings.

Developed sequential, concurrent sessions and validated them. Scheduled these sessions by using Workflow manager.

Responded to inquiries regarding EDI issues with enrollment, claims, payments, and/or clearinghouse activities.

Troubleshot medical X12 transactions and laboratory workflows.

Developed proprietary and HIPAA-compliant backend applications to parse and edit incoming X12 and proprietary transactions, with subsequent re-routing of responses to the appropriate requestor.

Used Teradata utilities FastLoad, MultiLoad, and TPump to load the data.

Decreased the run time of the existing SAS programs by eliminating redundant DATA steps by using WHERE statements instead of IF statements and using PROC FORMAT to format SAS variable values instead of multiple IF statements.

Applied Master Data Management (MDM) data integration concepts in large-scale implementation environments.

Good knowledge on Teradata Manager, TDWM, PMON, DBQL, SQL assistant and BTEQ.

Maintenance and support responsibilities related to Base SAS along with other troubleshooting tasks.

Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.

Used Cigna's application to setup the group and subscriber data in the FACETS application.

Tuned target, source, transformation, mapping and session to increase Session performance.

Understanding the Reporting requirements.

Created Extracts in and out of Facets system.

Experience with Mainframe Technologies COBOL 85, JCL, VSAM and DB2

Designed, developed, tested, and implemented the transition from a legacy system to TriZetto's Facets enterprise solution.

MDM development included creating base object tables, staging tables, and landing tables per the requirements in the LLD.

Developed stored procedures and tested the application with TOAD. Developed PL/SQL procedures and functions to build business rules that help extract data from source and load data to target.

Developed several complex IDQ mappings in Informatica using a variety of PowerCenter transformations.

Involved in System Integration Testing to check whether all the applications [Mainframe, ETL & Databases] are working fine as an end-to-end process.

Extensively wrote SQL queries (subqueries, correlated subqueries, and join conditions) for data accuracy, data analysis, and data extraction needs.
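
A representative example of the kind of correlated subquery used for data accuracy checks; the source and target table names are hypothetical.

    -- Flag target rows whose amount no longer matches the corresponding staging row.
    SELECT t.claim_id, t.claim_amount
      FROM dw.claim_fact t
     WHERE t.claim_amount <> (SELECT s.claim_amount
                                FROM stg.claim_stg s
                               WHERE s.claim_id = t.claim_id);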

Used ChangeMan as the version control tool to keep track of mainframe components such as JCLs, programs, and procedures.

Extensively used debugger to find out errors in mappings and later fixed them.

Ran and controlled workflows using the pmcmd command-line utility.

Created User exit features extending the functionality and features of MDM HUB.

Proactively evaluated the quality and integrity of data by unit test and System test for Informatica mappings according to the business needs.

Adhered to strict compliance policies/regulations; configured Facets modules such as Membership, Benefit, and Plan.

Managed open defects and brought them to closure by working closely with Developers during SIT, UAT and Performance testing.

Actively involved in creating test data for performance testing using various data generator tools.

Wrote different test cases for business rule validations during the UAT phase and logged the defects in Quality Center.

Environment: Informatica PowerCenter 9.1.0, Teradata, IDQ, Agile, JIRA, DB2, UML, EDI 834/820, Facets 5.0, MS SQL Server, Perl, Python, Oracle, flat files, MySQL, Autosys, ALM tools for testing, and the client's internal Data Generator tool.

Exilant Technologies, INDIA Feb 12 – Nov 14

Role: ETL Developer

Created a data warehouse to align the HR strategy to the overall business strategy. It can present an integrated view of the workforce and help in designing the retention scheme, improve productivity and curtail costs. This included extraction of data from different platforms, metadata management, and integration of mapping and loading to target tables.

Responsibilities:

Worked with Business Analyst in requirements gathering, business analysis and project coordination.

Responsible for developing complex Informatica mappings using different transformations.

Responsible for creating workflows and sessions using Informatica Workflow Manager and monitoring the workflow runs and statistics properties in Informatica Workflow Monitor.

Created and Monitored Sessions and various other Tasks such as Event-Raise Task, Event-Wait Task, Decision Task, Email Task, Assignment Task, Command Task etc. using Informatica Workflow.

Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and usage of workflow variables for triggering emails.

Responsible for tuning the Informatica mappings to increase the performance.

Implemented complex ETL logic using SQL overrides in the source Qualifier.
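
A hedged example of the kind of SQL override placed in a Source Qualifier (table and column names are placeholders): the join and filter run in the source database rather than in downstream transformations.

    SELECT e.emp_id,
           e.emp_name,
           d.dept_name,
           e.hire_date
      FROM employees e
      JOIN departments d
        ON d.dept_id = e.dept_id
     WHERE e.hire_date >= TO_DATE('2012-01-01', 'YYYY-MM-DD');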

Performed unit test development work and validated results with Business Analysts.

Developed Unix Scripts for updating the control table parameters based on the environments.

Responsible for providing written status reports to management regarding project status, tasks, issues/risks, and testing.

Analyzed requirements to create test cases and obtained client approval for execution.

Used defect tracking tools such as ATLAS for proper management and reporting of identified defects.

Wrote various SQL queries for validating test data and production data from source systems before loading, for performance testing/UAT.

Created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process.

Environment: Informatica 8.6.1, Oracle 10g, SQL, PL/SQL, MySQL, Teradata, TOAD, Shell Scripts, UNIX (AIX), Autosys, XSLT, MQ Migration Tools, Defect Tracking Tools (ATLAS).

ICICI Bank Jun’10 –Feb’12

Role: ETL Developer

Location: Hyderabad, India

Responsibilities:

Developed ETL and source to target mappings.

Developed the transformation/business logic to load data into data warehouse.

Created Several Informatica Mappings to populate the data into dimensions and fact tables.

Involved in the development of Informatica mappings and tuned them for optimum performance, Dependencies and Batch Design.

Identified the fact tables and slowly changing dimension tables.

Extensively used ETL to load data from multiple sources to the staging area (Oracle 9i) using Informatica PowerCenter. Worked with pre- and post-sessions, and extracted data from the transaction system into the staging area.

Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.

Used Debugger to test the data flow and fix the mappings.

Tuned the workflows and mappings.

Used Informatica designer for designing mappings and mapplets to extract data from Oracle sources.

Used various transformations like Expression, Filter, Router, Joiner, Lookup, Update Strategy, and Source Qualifier in many mappings.

Designed Complex mappings for Slowly Changing Dimensions using Lookup (connected and unconnected), Update strategy and filter transformations for retaining consistent historical data.

Wrote PL/SQL procedures for processing business logic in the database.

Performed query-based and cost-based optimization.

Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.

Executed Workflows and Sessions using Workflow Monitor.

Environment: Informatica Power Center V8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), SQL, PL/SQL, MS SQL Server, Oracle 11g/10g, DB2, Flat files, Shell Scripting, UNIX, Mainframes, Windows.


