AKHIL PUNNA 224-***-**** add754@r.postjobfree.com

Summary:

8+ years of IT experience in the design, analysis, data modeling, and development of software applications, and in implementing Business Intelligence solutions using data warehouse/data mart design, ETL/ELT, OLAP, and client/server and web applications with Oracle, SQL Server, Netezza, and Teradata databases on Windows and UNIX platforms.

Experience in data warehousing and in the extraction, transformation, and loading (ETL) of data from various sources like Oracle, SQL Server, Teradata, Netezza, XML files, and flat files into data warehouses and data marts using Informatica Power Center v6.x, 7.x, 8.x, 9.x (Repository Manager, Mapping Designer, Workflow Manager, Mapplet Designer, Workflow Monitor, Source Analyzer, Transformation Developer, Warehouse Designer).

Extensively used Source Qualifier, Connected and Unconnected Lookup, Normalizer, Router, Filter, Expression, Aggregator, Stored Procedure, Java, Web Services, Salesforce Lookup, SQL, Sequence Generator, Sorter, Joiner, Update Strategy, and Union transformations, as well as Mapplets.

Expertise in working with the Web Services Consumer transformation to fetch values from web service applications; also experienced in using SoapUI for unit testing.

Experience with Informatica Power Exchange to extract/read data from IBM mainframe sources such as DB2 and COBOL files.

Well versed with various aspects of ELT processes used in loading and updating Teradata data warehouse.

Integrated various applications with Salesforce.com using REST web services, both incoming and outgoing, to the Emptoris web service cloud.

Extracted data from multiple operational sources to load the staging area, data warehouse, and data marts using SCD Type 1/Type 2/Type 3/Type 4 loads.

Experience in using Informatica features like pushdown optimization and session partitioning to improve the performance of Informatica mappings.

Strong experience using Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts.

Proficient with Informatica Data Quality (IDQ) for cleansing and massaging data in the staging area.

Experience in integration of various data sources like Teradata, Oracle, PostgreSQL, HDFS, MS SQL Server, Flat Files, and XML Definitions.

Extensive experience using Informatica Power Center, Netezza, DB2, Oracle 10g/9i/8i, Teradata, NoSQL, UNIX shell scripting, and Windows, as well as external sources like XML files, flat files, GDG files, and COBOL (VSAM) sources.

Populated and refreshed Teradata tables using the BTEQ, FastLoad, MultiLoad, and FastExport utilities to load history and incremental data into multiple databases; a FastLoad sketch follows.
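
For illustration, a minimal FastLoad script of the kind described above, invoked from a UNIX shell; the logon, database, table, and file names are hypothetical placeholders, not from any actual engagement.

#!/bin/ksh
# Load a pipe-delimited flat file into an empty Teradata staging table.
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdpid/etl_user,password;
SET RECORD VARTEXT "|";
DEFINE sale_id  (VARCHAR(10)),
       sale_amt (VARCHAR(18))
FILE = /data/in/sales.dat;
BEGIN LOADING stg_db.stg_sales ERRORFILES stg_db.stg_sales_e1, stg_db.stg_sales_e2;
INSERT INTO stg_db.stg_sales (sale_id, sale_amt)
VALUES (:sale_id, :sale_amt);
END LOADING;
LOGOFF;
EOF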

Experience in the installation, configuration, and administration of the Informatica Power Center/Power Exchange client and server, with exposure to FTP and release management.

Worked extensively on error handling, performance analysis, and performance tuning of Informatica ETL components, Teradata utilities, UNIX scripts, and SQL scripts.

Worked with GitHub to commit code and migrate it across different environments, as sketched below.
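
A typical promotion flow with Git; the branch and file names below are illustrative only.

git checkout develop                      # work on the development branch
git add workflows/wf_load_claims.xml      # stage the exported Informatica object
git commit -m "Update claims load workflow"
git push origin develop                   # promoted onward via merges/pull requests per environment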

Knowledge of Informatica MDM and Informatica Cloud.

Strong data modeling experience with ODS and dimensional data modeling methodologies like star schema and snowflake schema.

Good exposure to Mainframe Systems and knowledge in handling COBOL files.

Well versed in writing UNIX shell scripting.

Good at planning, organizing, and prioritizing multiple tasks.

Interested in learning new technologies and willing to work in changing environments.

Strong decision-making and interpersonal skills, with result-oriented dedication to goals.

Technical Skills:

ETL Tool: Informatica Power Center 6.x, 7.x, 8.x, 9.x, Informatica Power Exchange, B2B Data Transformation

Databases: Oracle 9i/10g/11g, DB2, SQL Server, SSRS, T-SQL, Netezza, Teradata (FastLoad, MultiLoad, TPump, and FastExport), PostgreSQL

Data Modeling Tools: Erwin, Toad Data Modeler, MS Visio

Scripting Languages: UNIX shell scripting, Windows batch scripting

Operating Systems: Windows, LINUX, UNIX

Schedulers: Control-M, Autosys, Tivoli, ESP

Files: Flat files, COBOL files, XML files, WSDL files, VSAM files, GDG files

Professional Experience:

S. C. Johnson & Son, Inc., Racine, WI Oct 2019 to Present

ETL Designer/Data Modeler/Sr Informatica Analyst

Responsibilities:

Followed the Agile model and was involved in daily Scrum meetings.

Developed internal and external interfaces to send data at regular intervals to data warehouse systems.

Extensively used Power Center to design multiple mappings with embedded business logic.

Implemented Informatica mappings for extracting data from the data lake to the DWH.

Involved in discussing user and business requirements with the business team.

Performed ETL development work during iterations; participated in iterations, including Scrum meetings, to understand the functionality to be deployed to production; completed assigned development and resolved defects.

Developed and maintained documentation according to Data Lake standards; accountable for the quality and timeliness of technical and business solutions that meet the Data Lake's business needs.

Created complex mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables efficiently.

Attended meetings with business integrators for in-depth analysis of design-level issues.

Used Erwin's Model Mart for effective model management, enabling the sharing, dividing, and reuse of model information and designs to improve productivity.

Extracted data from the source databases (Teradata, SQL Server, DB2, and flat files) using Informatica to load it into a single data warehouse repository.

Developed a Conceptual model using Erwin based on requirements analysis

Extensively involved in performance tuning of the Informatica ETL mappings by using caches, overriding the SQL queries, and using parameter files.

Developed complex SQL queries to build the interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities like MultiLoad, FastLoad, TPT, BTEQ, and FastExport.

Wrote a UNIX wrapper for BTEQ to execute the ELT processes that load data into Confidential-specific tables; a minimal sketch follows.
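
A minimal sketch of such a wrapper, assuming ksh; the logon, database, and table names are placeholders (the actual Confidential tables are not shown).

#!/bin/ksh
# Run an ELT step in BTEQ and fail the job if any SQL statement errors out.
LOGFILE=/var/log/etl/orders_load_$(date +%Y%m%d).log

bteq <<'EOF' > "$LOGFILE" 2>&1
.LOGON tdpid/etl_user,password;
INSERT INTO dw_db.fact_orders
SELECT order_id, order_dt, amount FROM work_db.stg_orders;
.IF ERRORCODE <> 0 THEN .QUIT 8;
DELETE FROM work_db.stg_orders;   /* clear staging after a successful load */
.LOGOFF;
.QUIT 0;
EOF

rc=$?
if [ $rc -ne 0 ]; then
    echo "BTEQ load failed, rc=$rc; see $LOGFILE" >&2
    exit $rc
fi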

Wrote various UNIX shell scripts for scheduling data cleansing scripts and loading processes, and for automating the execution of mappings.

Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.

Worked on data profiling & various data quality rules development using Informatica Data Quality.

Wrote PL/SQL procedures and functions and was involved in the change data capture (CDC) ETL process.

Implemented Slowly Changing Dimension Type 2 for different dimensions; a sketch follows.
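
As a sketch of the Type 2 pattern (table and column names are hypothetical), run here through BTEQ: expire the current row when a tracked attribute changes, then insert the new version.

bteq <<'EOF'
.LOGON tdpid/etl_user,password;

/* Step 1: expire current rows whose tracked attributes changed */
UPDATE dw_db.dim_customer
SET    eff_end_dt = CURRENT_DATE - 1, current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_db.stg_customer s
               WHERE s.customer_id = dim_customer.customer_id
               AND   s.address <> dim_customer.address);

/* Step 2: insert a new current version for changed and brand-new customers */
INSERT INTO dw_db.dim_customer
      (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_db.stg_customer s
LEFT JOIN dw_db.dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL OR d.address <> s.address;

.LOGOFF;
EOF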

Involved in the Informatica, Teradata, and Oracle upgrade processes and tested the environment during upgrades.

Worked with Informatica version control extensively.

Experience in using SVN as version control for migration.

Implemented exception-handling mappings using Informatica Data Quality, and performed data validation using Informatica Analyst.

Managed enhancements and coordinated Informatica objects within every release.

Provided support for the production department in handling the data warehouse.

Worked under Agile methodology and used the Rally tool to track tasks.

Wrote thorough design documents, unit test documentation, and installation and configuration guides.

Performed bulk data imports and created stored procedures, functions, views and queries.

Environment: Informatica Power Center 10.2.0, Teradata 16.10.03.04, DB2, Flat Files, SQL Assistant, SQL Developer, Unix shell scripting, Windows, UNIX

Principal Financial Group, Des Moines IA July 2018 to Sep 2019

Sr. Informatica Developer/Data Analyst

Responsibilities:

The project followed Agile methodology; provided daily status to the team.

Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.

Communicated with business customers to discuss issues and requirements.

Ensured the solution was built for performance and scalability.

Built a logical dimensional model layer to satisfy the business requirements of users.

Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.

Imported source/target tables from the respective databases and Salesforce, and created reusable transformations and mappings using the Informatica Designer tool set.

Used Informatica PowerCenter to load data from different data sources like XML, flat files, mainframe files, DB2, Oracle, and Oracle DRM.

Created versions in DRM with different hierarchies and blended the data.

Worked on the full Hyperion suite of tools, including but not limited to Essbase, HFM, DRM, and FDMEE.

Worked on replicating complex Hyperion reports in Informatica as part of the Hyperion rewrite, when the Hyperion reporting tool was planned for retirement.

Experienced in developing Hyperion Essbase cubes and Hyperion Planning cube load rules from text files and Oracle Server.

Created Sqoop scripts to ingest data from HDFS to Teradata, and from SQL Server to HDFS and to PostgreSQL; an illustrative sketch follows.
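
An illustrative pair of Sqoop commands for those flows; hosts, credentials, and paths are placeholders, and the Teradata path assumes a Teradata JDBC connector is installed.

# SQL Server table -> HDFS
sqoop import \
  --connect "jdbc:sqlserver://sqlhost:1433;databaseName=sales_db" \
  --username etl_user --password-file /user/etl/.sqlserver.pwd \
  --table orders --target-dir /data/raw/orders --num-mappers 4

# HDFS directory -> Teradata staging table
sqoop export \
  --connect jdbc:teradata://tdhost/DATABASE=stg_db \
  --username etl_user --password-file /user/etl/.teradata.pwd \
  --table stg_orders --export-dir /data/curated/orders --num-mappers 4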

Wrote UNIX shell scripts for Informatica pre-session and post-session commands, and AutoSys scripts for scheduling the jobs (workflows).

Monitored data quality and integrity, performed end-to-end testing and reverse engineering, and documented existing ELT program code.

Wrote UNIX shell scripts for file manipulation, FTP, and scheduling workflows.

Worked with the TWS (Tivoli Workload Scheduler) and Autosys scheduling team in scheduling Informatica Jobs as per the requirement and frequency

Troubleshot performance problems and fine-tuned long-running SQL queries using Query Analyzer, execution plans, SQL Profiler, indexes, and hints to make reports and ETL run smoothly and efficiently.

Advanced planning/organizational, problem-solving, analytical, time management, decision-making, and communication skills, as well as solid leadership and basic presentation skills.

Environment: Informatica Power Center 10.1.1, Salesforce.com, Oracle SQL Developer, DB2, Oracle Hyperion, DRM, Teradata, PostgreSQL, Sqoop, Github, HDFS, Unix shell scripting, Perl scripting, VB Script, Tivoli scheduler, Essbase

Farm Bureau Financial Services, West Des Moines IA April 2018 to July 2018

Sr. Informatica Developer

Responsibilities:

Interacted with Requirements Analysts to gather information and to collect business requirements for developing Informatica mappings.

Followed the Agile model and was involved in daily Scrum meetings.

Wrote complex SQL queries to extract data from the DB2 database per the requirements, using partitioning for performance improvement, and used those SQL statements in Informatica Source Qualifiers.

Designed and developed complex mappings, reusable transformations for ETL using Informatica Power Center.

Worked on several transformations such as Filter, Joiner, Rank, Sequence Generator, Aggregator, Lookup, Expression, Union, Sorter, Router, SQL, Java, Web-service consumer transformations in Informatica.

Worked on designing and coding the complete ETL process using Informatica for various transactions and loading data from different sources like Flat Files, Web Services and Relational Database.

Developed complex SQL queries using various joins and extensively used dynamic SQL in stored procedures and functions.

Created Mapplets and reusable transformations to be re-used during the development life cycle.

Debugged the mappings extensively, hard-coding test data to test the logic instance by instance; involved in unit testing and documented the testing results, workflows, and jobs.

Created the design and technical specifications for the ETL process of the project

Developed SSRS reports modeled on legacy reports, with enhancements.

Created parameter files to parameterize the source/target file locations, file names, and DB connections, and wrote UNIX shell scripts (edited via FileZilla) to run the workflows from PuTTY; an example follows.
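
For example, a parameter file (wf_load_claims.param, read by the Integration Service at run time) and the pmcmd call that uses it; the folder, workflow, connection, and path names are illustrative.

[DWH_FOLDER.WF:wf_load_claims]
$InputFile_claims=/data/in/claims.dat
$DBConnection_tgt=DB2_DWH
$$LoadDate=2018-06-30

pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
    -f DWH_FOLDER -paramfile /opt/infa/param/wf_load_claims.param \
    -wait wf_load_claims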

Performed Unit Testing and played an active role in moving the code to Staging, Testing, UAT and Production environments

Scheduled the jobs using Control-M scheduler by checking dependencies

Environment: Informatica Power Center 10.1.1, Informatica Power Exchange 10.1.1, SQL Server, DB2, SSRS, Web Services, Control-M, Windows, LINUX/UNIX, Shell scripting, FileZilla, Putty, Flat Files, WSDL Files, SOAP UI, XML Files

Nationwide Insurance, Des Moines IA Nov 2016 to March 2018

Sr. Informatica Developer/Salesforce

Responsibilities:

Worked on a migration project to redesign the existing Oracle database on Teradata and decommission the Oracle system.

Interacted with business users, analysts to gather, understand, analyze, program and document the requirements of Data Warehouse.

Gained experience in an Agile environment by attending daily Scrum meetings, bi-weekly sprint planning meetings, retros, and demos.

Interacted with the business to create and design the data models and build the ETL components.

Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.

Used various transformations like Source Qualifier, Application Source Qualifier, Normalizer, Joiner, connected Lookup, Unconnected Lookup, Filter, Router, Expression, SQL transformation, Update Strategy, XML parser, Web services consumer transformations, Mapplets etc.

Implemented ETL solutions to extract data from various sources like XML, WSDL, and PDF files, flat files, Salesforce.com, and Teradata.

Worked on a data migration project from Salesforce.com (PDF files) to the web services cloud (ACMS).

Involved in performance tuning of the existing mapping and workflows to improve running time by using push down optimization and pipelining in Informatica.

Worked on SFDC session log error files to look into the errors and debug the issue.

Validated all the web services by testing the WSDLs using SoapUI.

Worked extensively with the Salesforce team to communicate issues encountered while loading data into Salesforce, and successfully pinpointed the errors by analyzing the Salesforce environment from our end.

Successfully completed end-to-end testing and loading of data into the Salesforce environment.

Involved in tuning the source SQL queries using query hints and in tuning on the Informatica side using partitioning.

Modified shell scripts to dynamically change the parameter variables used in mappings, start the ELT process, check for session success, and notify the status through email.

Design and Development of UNIX Shell Scripts.

Worked on Teradata TPT Loader to load the target tables

Created deployment plan and implementation plan for the change requests and got IT signoff on the same.

Developed shell scripts to handle file watching, file transfer (SCP, SFTP), process triggering, and email notification based on file availability; a sketch follows.
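
A condensed sketch of that pattern, with hypothetical paths, host, and support address.

#!/bin/ksh
# Wait up to an hour for the trigger file, pull the data file, alert on failure.
TRIGGER=/data/in/orders.done
tries=0
while [ ! -f "$TRIGGER" ]; do
    tries=$((tries + 1))
    if [ "$tries" -gt 60 ]; then
        echo "Trigger file never arrived" | mailx -s "Order feed missed SLA" etl-support@example.com
        exit 1
    fi
    sleep 60    # poll once a minute
done

scp etl_user@srchost:/outbound/orders.dat /data/in/ ||
    { echo "scp failed" | mailx -s "Order feed transfer error" etl-support@example.com; exit 2; }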

Enhanced performance for Informatica session using large data files by using partitions, increasing block size, data cache size and target-based commit interval.

ESP Scheduler was used to schedule jobs on daily and hourly basis.

Responsible for monitoring all running, scheduled, completed, and failed sessions; debugged the mappings of failed sessions.

Worked on Deployment groups for code migration to multiple environments.

Environment: Informatica 9.6.1, Oracle, Teradata, TPT, FLoad, Mload, Netezza, Toad, Salesforce.com, Web Services, ESP Scheduler, Windows, LINUX/UNIX, Shell scripting, WinScp, Putty, Flat Files, WSDL Files, SOAP UI, XML Files, PDF files, GitHub

Horace Mann Educators Corporation, Springfield IL April 2016 to Oct 2016

ETL Informatica Developer

Responsibilities:

•Involved in installation of Informatica Power Center & Power Exchange

•Extensively used Informatica Power Center for extracting, transforming and loading data from relational sources and non-relational sources

•Provided ETL technical options as needed

•Worked extensively on Informatica Power Exchange to read IBM mainframe DB2 source data, GDG files, and SEQ files.

•Worked with workstream and sprint team members to understand functional and non-functional requirements

•Created detailed design specifications for ETL-related components

•Extensively used various transformations such as Source Qualifier, Expression, Lookup, Sequence Generator, Aggregator, Update Strategy, and Joiner while migrating data from various heterogeneous sources like DB2, XML, COBOL, GDG, and Flat files to SQL Server.

•Developed Informatica mappings, re-usable transformations, re-usable mappings and Mapplets for Data Load.

•Set up Sessions and Workflows to schedule the loads at the required frequency using the Power Center Workflow Manager.

•Created and modified COBOL copybooks to connect to source data using the Power Exchange Navigator; monitored the ETL jobs and fixed bugs.

•Worked with Pre and Post Session SQL commands to drop and recreate the indexes on data warehouse using Source Qualifier Transformation of Informatica Power center.

•Analyzed the end-to-end data warehouse system.

•Wrote T-SQL statements for retrieval of data.

•Implemented Error handling and Exception handling in Informatica Mapping Designer

•Understood data quality scenarios from the business and implemented them in staging.

•Fixed operational issues in the jobs to keep data current and meet business requirements.

•Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.

•Developed database monitoring and data validation reports in SQL Server Reporting Service (SSRS)

•Performed Unit testing, Integration testing and System testing of Informatica mappings

•Built reports according to user requirements.

•Designed workflows with many sessions, using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.

•Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

•Implemented slowly changing dimension methodology for accessing the full history of accounts.

•Participated in weekly status meetings and provided status to the Project Manager on a daily basis.

•Developed ETL packages to target stores

•Used Microsoft SharePoint to upload the design and test case documents

Environment: Informatica Power Center 9.1.0, Informatica Power Exchange 9.1.0, T-SQL, DB2, SQL Server, SSRS, Windows, GDG, SEQ, Cobol Files.

Mercury Insurance Group, Brea CA Mar 2015 to Mar 2016

Sr Informatica Developer

Responsibilities:

Analyzed the source system data and the current reports at the client side to gather requirements for the data warehouse data model.

Participated in daily Scrum meetings and attended weekly key meetings.

Involved in creating Informatica mappings to populate staging tables and data warehouse tables from various sources like flat files, XML files, DB2, and SQL sources.

Analyzed the source data coming from SQL, DB2, XML, flat files and loading into Netezza warehouse. Worked with Data Warehouse team in developing Dimensional Model.

Developed and thoroughly tested Informatica mappings/workflows per the technical and functional specs provided.

Worked extensively in the Informatica Designer to build a robust end-to-end ETL process involving complex transformations (Source Qualifier, Java, Filter, Joiner, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Expression) for efficient extraction, transformation, and loading of data to staging and then to the data mart (data warehouse), verifying the complex logic for computing the facts.

Worked mainly on maintenance bug fixes on existing Informatica Mappings to produce correct outputs.

Extensively used reusable transformations, mappings, and code via Mapplets for faster development and standardization.

Created and configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.

Involved in Performance Tuning in Informatica for source, transformation, targets, mapping and session.

Created variables and parameter files for mappings and sessions so they could be migrated easily across environments and databases.

Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait and so on.

Used UNIX commands and UNIX shell scripting to interact with the server, move flat files, and load the files onto the server.

Responsible for file archival using UNIX shell scripting; a small sketch follows.
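
A small archival sketch of the kind used; the directory layout and 90-day retention are assumptions.

#!/bin/ksh
# Move processed files into a dated archive, compress them, purge old archives.
ARCHIVE=/data/archive/$(date +%Y%m%d)
mkdir -p "$ARCHIVE"
mv /data/in/*.dat "$ARCHIVE"/ 2>/dev/null
gzip "$ARCHIVE"/*.dat 2>/dev/null
find /data/archive -type f -mtime +90 -exec rm -f {} \;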

Worked on Tivoli to schedule Informatica jobs.

Used DbVisualizer to access the database.

Used Microsoft SharePoint to upload the design and test case documents.

Environment: Informatica 9.1, Netezza, nzsql, SQL, DB2, DB visualizer, Windows, UNIX, WinScp, Putty, Flat Files, XML Files, Tivoli, Hudson.

ICICI Bank, Hyderabad Jun 2010 to Aug 2013

SQL Developer/SSIS, SSRS

Responsibilities:

Involved in analyzing and development of the Data Warehouse.

Involved in installing and maintaining MS SQL Server 2000/2005.

Created the data model for the Marketing Information System team using Erwin.

Created SSIS packages to load data coming from various interfaces like OMS, Orders, Adjustments, and Objectives, and used multiple transformations in SSIS to collect data from various sources.

Worked on SSIS Package, DTS Import/Export for transferring data from Database (Oracle and Text format data) to SQL Server.

Worked on the data warehouse design and analyzed various approaches for maintaining different dimensions and facts in the process of building a data warehousing application.

Scheduled and maintained daily and monthly loads of OMS, Orders, Objectives and Adjustments data through jobs, tasks, and alerts.

Created stored procedures to build fact tables in the data mart for multi-dimensional analysis using SSAS, and produced ad-hoc, standard, and super-user reports using SSRS.

Created OLAP cubes for data mining using SSIS and generated reports from the OLAP cubes using SSRS.

Created components, tools, techniques, methods, and procedures used in an On-Line Analytical Processing (OLAP) environment for accessing and translating data into understandable and usable business information using SSAS.

Created SSIS packages for File Transfer from one location to the other using FTP task.

Created database objects (tables, indexes, views, stored procedures, and user-defined functions) according to the requirements of the project.

Generated various reports using Reporting Services (SSRS).

Environment: SQL Server Management studio, SQL Server 2005, SSIS, SSAS, SSRS, T-SQL, BCP, SQL Jobs, ERWIN.


