
Vineela KANAPARTHI

Contact: 720-***-****, Email: adi0f4@r.postjobfree.com

Career Summary:

Over 10 years of practical experience in the Banking, Health Care, and Pharmaceutical industries. Experience in the design and development of high-quality business applications, with a major focus on Data Warehousing and Business Intelligence.

Experience in all the main phases of a data warehousing project, namely the Strategy, Scope, Analysis, Design, Build, and Production phases.

Expertise in ETL processing using Power Center 10.2/9.5.1HF2/9.0.1HF4/8.1.1SP5 and the Erwin data modeler.

Hands-on experience with Informatica administration, installations, and upgrades of the Informatica Power Center 9.5.1 HF2/9.0.1 HF4 client and server.

Involved in mapping specifications, documentation, design, and implementation of Data Warehouse fact and dimension tables.

Developed a number of complex Informatica Mappings, Mapplets, and Reusable Transformations to facilitate one-time, daily, monthly, and yearly loading of data.

Performed data analysis to evaluate data gathered from multiple sources and to reconcile conflicts and address business issues.

Interpreted data from primary and secondary sources using statistical techniques and provided ongoing reports.

Involved in documenting Data Lineage using Collibra and MDR, and in building the Data Model using the data modeling tool Erwin.

Actively involved in Data Cleaning, Data Analysis, and Database Design.

Experience in performance tuning Mappings and SQL queries.

Extensively worked on data integrations and acquisitions, loading data into various Dimension and Operational Data Store tables using Informatica Power Center 10.2/9.5.1HF2/9.0.1HF4 from relational and non-relational source systems: the JD Edwards mainframe system, Microsoft SQL Server 2008, Siebel, MS Access, and flat files.

Hands-on experience developing packages using the Microsoft SQL Server components SSIS, SSRS, and SSAS.

Converted the complete AS400 RPG process to Informatica by building the mappings and workflows.

Worked on requirements gathering and technical specifications, and implemented the Data Lineage process.

Facilitated weekly Sprint activities, including grooming, Sprint planning, and backlog review, using JIRA, Confluence, SharePoint, and Agile methodology.

Involved in the Release Management process for the creation, update, and approval of documents (procedures, work instructions/protocols) supporting the ITSM processes.

Good knowledge of Informatica MDM, IDQ, the Big Data framework (Hadoop, Hive, Impala, Hue, Map/Reduce), Microsoft Azure Cloud Services, and the Snowflake database.

Implemented a Data Warehouse notification alert system for our night stream jobs that reports job status to our Tango monitoring system.

Involved in providing Production Support to various Informatica ETL jobs and Database programs.

Hands-on experience with SQL and PL/SQL, and good knowledge of T-SQL stored procedures.

Good understanding of relational database management systems such as AS400 DB2, Oracle, Microsoft SQL Server 2016/2012, and Teradata.

Strong understanding of the logical and physical designs of Star and Snowflake schemas used in dimensional modeling.
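For illustration, a minimal star schema sketch in SQL (all table and column names here are hypothetical, not taken from any project below): a central fact table carries the measures plus foreign keys to the surrogate keys of its dimension tables.

```sql
-- Minimal star schema sketch: one fact table referencing two dimensions.
-- All names are illustrative, not from a specific project.
CREATE TABLE dim_customer (
    customer_key   NUMBER        PRIMARY KEY,   -- surrogate key
    customer_id    VARCHAR2(20)  NOT NULL,      -- natural/business key
    customer_name  VARCHAR2(100),
    region         VARCHAR2(50)
);

CREATE TABLE dim_date (
    date_key       NUMBER        PRIMARY KEY,   -- e.g. 20201228
    calendar_date  DATE          NOT NULL,
    fiscal_month   VARCHAR2(10),
    fiscal_year    NUMBER
);

CREATE TABLE fact_sales (
    customer_key   NUMBER  NOT NULL REFERENCES dim_customer (customer_key),
    date_key       NUMBER  NOT NULL REFERENCES dim_date (date_key),
    quantity       NUMBER,
    sales_amount   NUMBER(18,2)
);
```

In a snowflake schema, dim_customer would itself be normalized further, e.g. region split out into its own table.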

Development experience on Windows 2010/NT/XP and UNIX (Solaris, HP-UX, AIX) platforms.

Experience leading a team and cross-training team members. Excellent communication and interpersonal skills; able to perform independently or as part of a team, and to quickly grasp new technical and business concepts and apply them as needed.

Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities and flexibility in work schedules.

Worked on developing applications using Java and J2EE technologies.

Technical Skills

ETL Tools: Informatica Power Center 10.2/9.5.1HF2/9.0.1HF4, SSIS, IBM DataStage, Informatica MDM

Databases: Microsoft SQL Server 2016/2014, AS400 DB2, Teradata, Oracle 12c/11g/10g/9i/8i, MS Access

Methodologies: Star Schema, Snowflake Schema, Fact and Dimension Tables

Modeling/Data Lineage Tools: Erwin, Collibra, MDM

Query/Database Tools: SQL, PL/SQL, T-SQL, TOAD 12, SQL*Plus, iSeries Navigator, AS400 Console

Languages: SQL, PL/SQL, Java, UNIX Shell Scripting

Testing Tools: WinRunner, QTP, LoadRunner, SQA Suite, Test Director

Operating Systems: UNIX, Mainframe, Windows XP/NT/2008, MS-DOS

Scheduling/Version Control Tools: Autosys, Control-M, GitHub, Bitbucket, JIRA, VersionOne, Confluence

Academics & Certifications

Bachelor of Technology in Computer Science and Engineering from JNTU, India.

Sun Certified Programmer for the Java 2 Platform 1.4.

Professional Experience

Co Operative Bank, Denver, Colorado Oct ‘20 – Dec ‘20

Informatica Developer

The FCL Data Warehouse (FCL DW) serves as an information repository for multiple departmental and reporting processes. CoBank recently determined that these processes need to continue to be supported after the Leasewave (LW) migration. To prevent disruption, the firm decided to explore the feasibility of continuing to use the FCL DW by importing data from LW instead of InfoLease (IL).

Responsibilities:

Involved in business analysis and requirements collection; identified technical risks, assumptions, and dependencies.

Involved in writing the use cases for the current system and analyzed the stored procedures to identify the performance bottlenecks of the current system.

Involved in developing Logical and Physical Data Models using Erwin.

Designed the ETL processes using Informatica to load data from MS SQL Server 2016, MS Access, and Excel spreadsheets into the FCL DW.

Created various Transformations such as Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, and Sequence Generator.

Involved in unit, integration, and user acceptance testing of Mappings, Mapplets, and Sessions.

Created and maintained the Informatica sessions and ETL scripts for the daily loads into the Data Warehouse.

Applied performance tuning techniques to improve performance.

Created PL/SQL procedures and program units for data extraction, transformation, and loading.
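As a hedged illustration of such a unit (all object names are assumed, not from the project), a minimal set-based PL/SQL procedure that extracts cleansed rows from a staging table and loads a target table:

```sql
-- Minimal PL/SQL ETL sketch; stg_orders, dw_orders, and their columns are assumed names.
CREATE OR REPLACE PROCEDURE load_orders AS
BEGIN
    -- Transform and load in a single set-based statement where possible.
    INSERT INTO dw_orders (order_id, customer_id, order_total, load_date)
    SELECT s.order_id,
           s.customer_id,
           NVL(s.order_total, 0),   -- simple cleansing rule: default missing totals
           SYSDATE
    FROM   stg_orders s
    WHERE  s.order_id IS NOT NULL;  -- reject rows without a business key

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        RAISE;                      -- surface the failure to the calling job
END load_orders;
/
```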

Environment: Informatica Power Center 10.2, Windows 2010, SQL Server 2016, UNIX, Control-M, Cognos 11

Chase, Wilmington, DE Mar ‘20 – Jun ‘20

Software Developer

The main purpose of the CARI AUTO (Consumer Analytics Reporting Infrastructure) finance project is to extract data from various source systems such as flat files, Teradata, and Oracle databases, transform it according to the business specifications using the Big Data framework Hadoop, and load it into the CARI AUTO Data Warehouse. We are currently migrating the CARI AUTO process from the Big Data Hadoop platform to an Oracle database using Informatica Power Center.

Responsibilities:

Involved in business analysis and requirements gathering; identified technical risks, assumptions, and dependencies.

Involved in building the Data Model using the Erwin Data Modeler.

Worked on Data Lineage documents using MDM.

Good knowledge of the Big Data framework: Hadoop, Hive, Impala, Map/Reduce, etc.

Involved in migrating the CARI AUTO process from BIG Data framework Hadoop to Oracle by using Informatica Power Center.

Applied concepts, industry research, best practices, and agile methodologies and tools (Jira, Confluence) to implement Big Data solutions.

Involved in supporting a Hadoop-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.

Designed, planned, and developed programs to perform automated extract, transform, and load operations between data sources when working with large data sets.

Implemented process improvements (automation, performance tuning, workflow optimization).

Provided accurate and timely status of work activity to management.

Environment: Big Data framework (Hadoop, Hue, Hive, Impala), Oracle 12c, Informatica Power Center 10.2, Windows 2010, UNIX, Autosys, Bitbucket, SAS

Walt Disney World, Orlando, FL (Accenture) Sep ‘19 – Feb ‘20

Functional Data Integration Specialist/Informatica Developer

The main purpose of the Ticket Replacement program is to simplify the Walt Disney World (WDW) ticketing experience for guests, cast, and business partners by replacing our end-of-life ticketing system, in order to achieve future revenue goals and improve efficiency while maximizing guest and business flexibility. ATS is the legacy ticketing system currently used by Disney, and WDW is in the process of replacing it with VGS, a new ticketing system. Served as the Data Migration lead for the Ticketing Replacement Program, providing data migration, data quality, and data validation services in the context of migrating ticketing data to VGS.

Responsibilities:

Understood the business requirements based on the functional specification to design the ETL methodology in the technical specifications. Developed a number of complex Informatica Mappings, Mapplets, and Reusable Transformations to facilitate one-time, daily, monthly, and yearly loading of data.

Involved in fixing ETL issues, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data.

Extracted data from different sources such as flat files, DB2, and SQL Server and loaded it into the Microsoft SQL Azure target database.

Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (load data, analyze using OLAP tools).

Provided data modeling services for a large data warehouse design.

Developed schedules to automate the update processes and Informatica Sessions/Batches.

Involved in loading data into Warehouse tables and loading data into the staging area using SQL*Loader.
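A minimal SQL*Loader control file of the kind used for such staging loads might look like this (the data file, table, and column names are assumptions, not taken from the project):

```sql
-- Hypothetical SQL*Loader control file for a delimited staging load.
-- Run as: sqlldr userid=... control=stg_orders.ctl log=stg_orders.log
LOAD DATA
INFILE 'orders.dat'
APPEND
INTO TABLE stg_orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  order_id,
  customer_id,
  order_total,
  order_date  DATE "YYYY-MM-DD"
)
```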

Environment: Informatica Power Center 10.2, Windows 2010, Microsoft SQL Azure, MS SQL Server 2016, UNIX, Autosys, SnApp, GitHub

Bank of America, New York, NY (TCS) Jul ‘18 – Apr ‘19

Data Analyst/Informatica Developer

The goal of the EqsDatastore ADS project is to verify EqsDatastore as the single provisioning point and ADS (Authorized Data Store) for the Prime Services Client data domain, establishing and maintaining data management and governance procedures and controls. A collection of evidence will be documented showing compliance with Enterprise Data Management (EDM) standards and policies.

Prime Services Client Data is a data domain for consolidated client transaction, position, and balance information for physical and synthetic BAML Prime Services clients across multiple legal entities. This single provisioning point is used to distribute prime brokerage client data globally to external clients and third-party servicing companies for broker-dealer client daily reconciliation.

Responsibilities:

Understood the business requirements based on the functional specification to design the ETL methodology in the technical specifications.

Worked on Data Lineage from source systems to the downstream systems using Collibra and Ab Initio in order to certify the EQS Data Store as an ADS.

Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.

Responsible for development, support, and maintenance of the ETL (Extract, Transform, Load) processes using Informatica Power Center 10.1.

Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.

Integrated heterogeneous data sources such as Oracle, DB2, SQL Server, and flat files (fixed-width and delimited) into the Staging Area.

Fine-tuned Transformations and mappings for better performance.

Environment: Informatica Power Center 10.1/9.5, Windows 2010, Oracle 11g, SQL Server, UNIX, Autosys, Toad 11, Cognos 11, Ab Initio, Collibra

Bank of America, NJ Mar ‘15 – Dec ‘15

Informatica Developer

Bank of America is one of the largest American multinational banking and financial services corporations. It offers a wide range of financial services targeted at both commercial and individual consumers, and covers approximately 57 million consumer and small business relationships.

The main objective of the GBAM Surveillance project is to extract trade and credit market data from multiple sources and transform and load the data into the CSDR Data Warehouse Trade and Account tables.

Responsibilities:

Understood the business requirements based on the functional specification to design the ETL methodology in the technical specifications.

Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.

Responsible for development, support, and maintenance of the ETL (Extract, Transform, Load) processes using Informatica Power Center 9.5.

Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.

Integrated heterogeneous data sources such as Oracle, DB2, SQL Server, and flat files (fixed-width and delimited) into the Staging Area.

Experienced in handling slowly changing dimensions.

Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.

Involved in providing Production support.

Environment: Informatica PowerCenter 9.5/9.0.1HF4, Windows 2008, Oracle 11g, SQL Server 2014, UNIX, Autosys, Toad 11, Actimize 5.7

Henry Schein Inc., NY June ‘08 – Sep ‘14

Sr. Informatica Developer

Medical Dashboard

SSIS/SQL Server 2012/2008, MS Windows 2008, JD Edwards/AS400 DB2, SiSense

The main objective of this project is to create Data Warehouse reports to support dashboard-type reporting for Henry Schein Medical Reporting (the BI team); the data will be sourced from the Data Warehouse and other data sources. The data extraction channels and processes will be addressed separately. The project entails the creation of the summary reports.

Responsibilities:

Understood and analyzed the business requirements and the high-level documentation.

Developed SSIS packages and workflows based on business requirements; various transformations such as Join, Expression, Filter, and Lookup were used to implement simple and complex business logic.

Created reusable transformations and Mapplets and used them in Mappings.

Worked with serial and multi files.

Fine-tuned Transformations and mappings for better performance.

Created and configured Worklets and Sessions to transport the data to the target warehouse using the Workflow Manager.

Created various tasks such as Event Wait, Event Raise, E-mail, and Command.

Tested the mappings with mock up data and debugged the code for errors before sending it to deployment.

Troubleshot problems by checking session and error logs; also used the Debugger for complex troubleshooting.

Improved the performance by using look-ups and checkpoints wherever necessary.

Power Center 9.5.1HF2 Upgrade

Power Center 9.5.1HF2/9.0.1HF4, SQL Server 2008, MS Windows 2008 64-bit

As an Informatica/ETL developer, played a pivotal role in leading the companywide Informatica upgrade from Power Center 9.0.1HF4 to Power Center 9.5.1HF2.

Responsibilities:

Created the upgrade plan to migrate all internal applications, data marts and data warehouse to Informatica Power Center 9.5.1 HF2.

Worked on installing Power Center 9.0.1 HF4 and upgrading to Power Center 9.5.1 HF2 on 3 servers (Development, QA, and Production).

Implemented repository and domain backups and restores, and was involved in Power Center administration.

Implemented the test plan, which included all production mappings, sessions, and workflows.

Worked on upgrading across multiple versions of Informatica Power Center, i.e., from Power Center 8.1.1 SP5 to 9.0.1HF4, 9.5.1HF2, and 9.6.

Alpha Scientific Customer Integration

Power Center 9.5.1HF2/ SQL Server STD Edition 2008/ MS Windows 2008/JD Edwards AS400 DB2

Alpha Scientific is a specialty pharmaceutical company that develops and commercializes health care products. Henry Schein has entered into an agreement with the company to market Alpha Scientific's items. In this integration request, we added Alpha Scientific sales history for 2014 and 2013 to the Acquisition Fact table, building new mappings based on business rules to load the acquisition sales history data into the Data Warehouse.

Responsibilities:

Involved in business analysis and requirements collection; identified technical risks, assumptions, and dependencies.

Involved in writing the use cases for the current system and analyzed the stored procedures to identify the performance bottlenecks of the current system.

Involved in developing Logical and Physical Data Models using Erwin.

Designed the ETL processes using Informatica to load data from MS SQL Server, MS Access, and Excel spreadsheets into the target Oracle database; created various Transformations such as Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, and Sequence Generator.

Created and scheduled sessions/workflows and Batches using Workflow Manager to load the data into the Target Database.

Involved in unit, integration, and user acceptance testing of Mappings, Mapplets, and Sessions.

Created and maintained the Informatica sessions and ETL scripts for the daily loads into the Data Warehouse.

Tuned sessions, mappings and transformations to perform better.

Created PL/SQL procedures and program units for data extraction, transformation, and loading.

Used SQL*Loader to transfer data to the fact and dimension tables in the Data Warehouse and maintained them.

Siebel Customer Demographics

Power Center 9.0.1HF4/ SQL Server STD Edition 2008/ MS Windows 2008/AS400 DB2

Siebel Customer demographic data was identified by the business community as the first deliverable in a larger, multi-phased effort to add Siebel data to the Data Warehouse. The Data Warehouse was expanded to include the new Siebel Customer demographics data as requested, to further enhance DW customer reporting. New sets of ETL mappings were developed based on the user requirements to load the data into the Data Warehouse.

Responsibilities:

Developed mappings using transformations like Source Qualifier, Aggregate, Sorter, Router, Filter, Joiner, Expression, Connected and Unconnected Lookup, Update Strategy, Union and Sequence generator transformations.

Used Lookup Transformations to access data from tables that are not sources for the mapping, and used Unconnected Lookups to improve performance.

Used XML Transformations to move Claimant data from XML files in staging to the targets.

Prepared ETL flow of data from Staging to Data Mart.

Responsible for testing the mappings and ensure that the mappings do the transformations as proposed.

Involved in moving the Repository from Development to Testing and Production after due testing.

Used ERWIN 4.0 to reverse engineer the table scripts.

Involved in Creating tasks, worklets, workflows to load various dimensions that include Products, Product Categories, Coverage Type, Coverage Level, Risk Level Dimensions.

Scheduled, ran, and monitored sessions using the Workflow Manager and Workflow Monitor to update claims, drug, physician, and customer data.

Applied slowly changing dimension techniques and Dynamic Lookup techniques.

Expert in advanced mapping and workflow techniques.

Defined Target Load order Plan for loading data into Target Tables.

Involved in dealing with performance issues at various levels such as target, sessions, mappings and sources.

Collaborated with Business Users and the DBA for requirements gathering and business analysis.

Worked with heterogeneous sources such as Oracle and flat files.

Involved in Extraction, Transformation and Loading (ETL) of data using Informatica Power Center 9.0.1HF4.

Sourced data from SQL Server, Microsoft Excel, and flat files and loaded it into the Oracle data warehouse.

Involved in defining the new dimensions, fact tables and collection process.

Implemented SCD methodology, including Type 1 and Type 2 changes, to keep track of historical data.
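For instance, a Type 2 change can be applied in two set-based steps: expire the current row, then insert the new version. The sketch below assumes hypothetical dim_customer/stg_customer tables with one tracked attribute (address) plus effective-date and current-flag housekeeping columns; none of these names come from the project itself.

```sql
-- Step 1: expire the current version of any customer whose tracked attribute changed.
UPDATE dim_customer d
SET    d.effective_end = SYSDATE,
       d.current_flag  = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address    <> d.address);

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
    (customer_key, customer_id, address, effective_start, effective_end, current_flag)
SELECT dim_customer_seq.NEXTVAL,   -- new surrogate key from an assumed sequence
       s.customer_id,
       s.address,
       SYSDATE,
       NULL,
       'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id  = s.customer_id
                   AND    d.current_flag = 'Y'
                   AND    d.address      = s.address);
```

A Type 1 change, by contrast, would simply update the attribute in place without creating a new row.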

Extensively used debugger to find out errors in mappings and later fixed them.

Tuned existing mappings, sources, targets and sessions to improve the performance of work units.

Migrated mappings from Development to Production folders.

Used the Workflow manager to create workflows, worklets and sessions.

Created, scheduled, and monitored sessions and batches on a run-on-demand or run-on-time basis using Informatica Power Center 9.0.1HF4.

Vendor Reporting System for Henry Schein Canada

Power Center 9.0.1HF4/ SQL Server STD Edition 2008/ MS Windows 2008/JD Edwards AS400 DB2

The Vendor Reporting System (VRS) provides sales information, on a daily basis, to Henry Schein Canada (HSC) vendors and business partners. VRS has a multi-tier design: a front-end desktop application supports vendor and subscription maintenance, while a back-end report engine uses the Data Warehouse data for reporting. We added a new Daily Sales Reporting capability to VRS, which required a mechanism to allow the Daily Sales Reports to run. This mechanism updates the VRS Run_Schedule table with a record that has the Run_Date and the reportable GL Date in the SQL_Date field. Because the timing is significant, we created an Informatica session/mapping that runs after the Daily Sales load has completed successfully; this mapping inserts a record into the VRS Run_Schedule table, enabling access to the reports.
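The effect of that post-Daily-Sales mapping amounts to a single insert along these lines (beyond the Run_Date and SQL_Date fields named above, the table definition and the :gl_date bind variable are assumptions for illustration):

```sql
-- Sketch of the record the post-Daily-Sales mapping writes to the VRS
-- Run_Schedule table so the Daily Sales Reports become eligible to run.
INSERT INTO run_schedule (run_date, sql_date)
VALUES (TRUNC(SYSDATE), :gl_date);
COMMIT;
```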

Responsibilities:

Developed number of Complex Informatica Mappings, Mapplets and Reusable Transformations to facilitate one time, Daily, Monthly and Yearly Loading of Data.

Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data.

Designed and developed Power Prompts for Web Users for Flexible Querying.

Extracted data from different sources such as flat files, DB2, and SQL Server and loaded it into the Oracle target.

Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) to maintain the data marts (load data, analyze using OLAP tools).

Performed business intelligence reporting development to design, develop, and implement operational reporting and OLAP analysis.

Provided data modeling services for a large data warehouse design.

Developed mappings using Aggregator, Joiner, Router, Lookup, and Update Strategy transformations (business rules).

Developed schedules to automate the update processes and Informatica Sessions/Batches.

Involved in loading data into Warehouse tables and loading data into the staging area using SQL*Loader.

As part of a data conversion and performance improvement project, implemented the following improvements to the ETL:

Replaced the complete AS400 RPG extract process, which pulled data from the JD Edwards mainframe system into our staging environment, with Informatica mappings, significantly improving data load performance.

Created new batches that utilized parallelism better.

Analyzed and tested scheduling strategies to fine-tune our Informatica batches.

Replaced lookups in mappings with join conditions and simplified mappings by removing unneeded objects (see the sketch after this list).

Broke up complicated, slow mappings into multiple mappings that ran much faster and could be run in parallel.

Added an AS400 program as an ETL preprocess that clears the large-volume tables in our daily refresh, which improved performance.

Implemented design improvements that increased the scalability of the target data marts, including the increased parallelism mentioned above, which allowed sessions to be spread across all available processors. This scalability proved crucial in enabling enterprise-wide usage of the data mart.

Implemented documentation standards and practices to make mappings easier to maintain.
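As a hedged illustration of the lookup-to-join rewrite mentioned above (all table and column names hypothetical): instead of resolving a reference value row by row through a Lookup transformation, the Source Qualifier's SQL override joins the reference table once, letting the database resolve it set-wise.

```sql
-- Before: an Informatica Lookup transformation fetched list_price one row at a time.
-- After: the Source Qualifier SQL override joins the reference table set-wise.
SELECT o.order_id,
       o.item_id,
       o.quantity,
       p.list_price            -- formerly resolved via a Lookup transformation
FROM   orders o
JOIN   item_price p
  ON   p.item_id = o.item_id;
```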

Current Corporate and Divisional Pricing

Power Center 9.0.1HF4/ SQL Server STD Edition 2003/ MS Windows 2003/ JD Edwards /AS400 DB2

The availability of various pricing structures in the data warehouse is a key component in understanding and maximizing our profit margin. There are various prices available for each item at Henry Schein, including corporate price, divisional price, sales plan price, promotional price, customer special price, and other variations. Sales plan pricing is currently available in the data warehouse. In this project, we expanded the pricing data in the data warehouse to include both current corporate and current divisional pricing for the US and Canada.

Responsibilities:

Worked with the Informatica tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer) and designed the mappings between sources (external files and databases) and operational staging targets.

Used Aggregator, Sequence Generator, Joiner, and other transformations in this data population process.

Involved in the development of Informatica mappings, and tuned them for better performance.

Involved in creating and maintaining batches for weekly and daily runs.

Validated the data in the warehouse and data marts after the loading process by balancing it against the source data.

Designed the ETL processes using Informatica to load data from MS SQL Server, MS Access, and Excel spreadsheets into the target Oracle database; created various Transformations such as Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, and Sequence Generator.

Created and scheduled sessions/workflows and Batches using Workflow Manager to load the data into the Target Database.

Treselle Systems Private Ltd., Bangalore, India Jan ’04 – May ’04

Java Developer

iShape [www.ishape.com] is a highly specialized software program delivered via the Internet as a subscription-based online service. The program analyzes the specific needs and goals of an individual and then dynamically creates a highly personalized diet and exercise plan. These plans are customized and integrated to achieve maximum results, and are delivered conveniently to the user each day via email. iShape is designed to provide customers with a unique and powerful combination of benefits found nowhere else: all the industry-leading expertise of SHAPE magazine, customized for the unique needs of each individual and delivered with the interactivity and convenience of the World Wide Web.

Responsibilities:

Involved with business partners in requirements gathering, analysis, design, development, and enhancement of the website.

Helped clients better understand their requirements by capturing their business needs.

Secured their full agreement through regular requirements review meetings.

Converted the requirements into High-Level and Low-Level Designs.

Prepared Process Flow Diagrams for the Low-Level Design, adhering to the current architectural standards.

Created Servlets and Velocity templates according to corporate standards.

Coordinated with the offshore team in understanding the business requirements on a regular basis.

Implemented database connectivity using the JDBC API.

Implemented client-side and server-side validations to ensure the accuracy of data entered by end users.

Analyzed, designed, and developed UI screens using JSP.

Involved in the Unit and Integration testing.

Conducted UATs to provide the data the clients were looking for.

Delegated and managed regression and end-to-end testing to ensure the performance of the entire system and its interfaces.

Analyzed end-user needs and customized the application per user requests.

Environment: Java 1.4, J2EE Technologies, Struts, Hibernate, Resin Server, XML, Eclipse 3.2, Maven, Oracle 9i and CVS.


