
SQL Server Power Center

Location:
Davidson, NC
Posted:
April 07, 2024


Himabindhu Eddula | Mobile: +1-404-***-**** | Email: ad4umv@r.postjobfree.com

SUMMARY

* ***** ** ********** ** leading, designing, developing, maintaining, and building large business applications involving data migration, integration, conversion, data warehousing, and testing.

ETL lead experience using Informatica Power Center 10 (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor and Informatica Server).

Very strong understanding of database concepts; hands-on experience with Oracle and SQL Server.

Expert in all phases of the Software Development Life Cycle (SDLC): project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance.

Reviewed and assessed business requirements, identified gaps, defined business processes, and delivered project road maps including documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.

Expertise in data warehousing, ETL architecture and Metadata development.

Good Knowledge in data modeling and data analytical concepts.

Experience in working with various versions of Informatica Power Center 10.x/9.x/8.x – client and server tools.

Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.

Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and balancing of job runs.

Experience in integrating various data sources such as SQL Server, Oracle, Teradata, flat files, DB2, and SAP.

Experience in developing complex PL/SQL packages, functions, cursors, triggers, views, and materialized views.

Expert in troubleshooting, debugging, and improving performance at different stages: database, workflow, and mapping.

Experience in designing database architecture for extract, transform, load (ETL) environments and developing mappings and processes using Informatica.

Worked closely with the reporting team to meet client requirements.

Involved in writing Unit test cases for complex scenarios.

Experience in creating UNIX shell scripts.

Knowledge of installation and configuration of the Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.

Experience includes thorough domain knowledge of life sciences, banking, healthcare information technology, insurance & reinsurance, pharmacy claims systems, and the telecom industry.

Enthusiastic and goal-oriented team player with excellent communication and interpersonal skills, leadership capabilities, and a high level of adaptability.
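
The pre/post-session balancing scripts mentioned above could be sketched in UNIX shell roughly as follows; the function name, messages, and counts are illustrative assumptions, not taken from any actual project script:

```shell
#!/bin/sh
# Hypothetical post-session balancing check (illustrative only):
# compare the row count captured at extract time against the row
# count loaded into the target, and fail the job stream on mismatch.

check_balance() {
    src_rows=$1   # rows read from the source, e.g. from the session log
    tgt_rows=$2   # rows written to the target
    if [ "$src_rows" -ne "$tgt_rows" ]; then
        echo "BALANCE FAILED: source=$src_rows target=$tgt_rows" >&2
        return 1
    fi
    echo "BALANCE OK: $src_rows rows"
}

# In a real post-session command these counts would come from the
# Informatica session log or an audit table; literals stand in here.
check_balance 100 100
```

A scheduler such as Autosys or Control-M would then treat the nonzero exit status as a job failure and hold downstream loads.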

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10.x/8.x, Power Exchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, OLAP, OLTP, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling

Databases: SQL Server, Oracle, Teradata, MySQL, DB2

Data Modeling: Erwin

BI Tools: Tableau, Business Objects

Languages: XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Operating Systems: HP-UX, Sun Solaris, SCO-UNIX, Linux, Windows

Other Tools: Jira, GitLab, MS Visual SourceSafe, PVCS, Autosys, Control-M, Remedy, HP Quality

EDUCATION

B.Sc. in Mathematics.

Master of Computer Applications (MCA).

PROFESSIONAL EXPERIENCE

ABBOTT, Cognizant

Sr. Associate/ETL Developer Jan 2021 to Dec 2022

Professional Experience:

Worked closely with onsite and offshore teams to consolidate work on a daily basis.

Collaborated with business analysts and stakeholders to gather and clarify data requirements.

Gathered requirements, prepared mapping documents, and obtained client approvals.

Developed and maintained ETL (Extract, Transform, Load) processes using Informatica PowerCenter.

Designed and optimized SQL queries to extract, transform, and load data from various sources into data warehouses.

Profiled production data during import and validated changed business rules.

Performed data validation and quality checks to ensure accuracy and reliability of data.

Troubleshot and resolved data integration issues in a timely manner.

Created and maintained technical documentation related to ETL processes.

Reviewed the ETL mapping sheet and identified test scenarios.

Experience in dimensional modeling: created data marts using Star and Snowflake schemas, identified facts and dimensions (SCD Type I and SCD Type II), and performed physical and logical data modeling.

Migrated code into higher environments using MKS.

Analyzed data using Tableau.

Responsible for creating and maintaining the database architecture document for the PL/SQL ETL framework.

Worked closely with database architects and the Cognos report team, preparing report queries to fulfill client requirements.

Set up job scheduling and performed maintenance in production.

Created understanding documents for modules that were added or modified.

Environment: Informatica Power Center 10.2, Toad for Oracle, PuTTY for UNIX, WinSCP for file transfers, Informatica TDM for data profiling, Autosys for scheduling, Tableau, MKS for migrating code into higher environments.

Ecolab, St. Paul, MN

SPRS Systems

ETL Onsite Coordinator March 2016 to July 2016

Professional Experience:

Worked closely with the client and offshore team to consolidate work on a daily basis.

Reviewed the ETL mapping sheet and identified test scenarios.

Wrote test cases for ETL to compare source and target database systems.

Created DDL scripts to build the database schema and database objects such as tables, stored procedures, views, functions, and triggers using T-SQL.

Developed test scenarios/scripts and data requirements for Consigned Inventory.

Involved in scrum sprint planning, demos, and retrospectives.

Attended daily scrums, updated the scrum board, and coordinated with the team to complete requirements.

Created understanding documents for the modules covered.

Logical and Physical data modeling was done using Erwin for data warehouse database in STAR SCHEMA.

Implemented Agile Scrum methodology and performed agile testing: requirements-based functional, integration, system, database, and regression testing, plus user-acceptance-testing-related QA activities.

Provided screenshots and detailed descriptions to facilitate smooth knowledge transfer for team members and other teams.

Attended business reviews and walkthroughs of flow diagrams with business users to discuss the various "business-critical" flows.

Supported the client team post-release with future enhancements and change requests that add value to the business and end users.

Worked closely with development on a day-to-day basis to communicate issues and observations found during test execution.

JPMC, Cognizant Dec 2012 to Sep 2014

Associate/ETL Developer

Professional Experience:

Created new mappings according to business rules to extract data from different sources, transform and load target databases.

Debugged the failed mappings and fixed them.

Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.

Defects were tracked, reviewed, and analyzed.

Modified the mappings according to the new changes and implemented the persistent cache in several mappings for better performance.

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.

Involved in writing stored procedures and shell scripts for automating the execution of jobs in pre and post sessions to modify parameter files, prepare data sources.

Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.

Identified the issues in sources, targets, mappings and sessions and tuned them to improve performance.

Created and used reusable mapplets and worklets to reduce the redundancy.

Developed robust Informatica mappings and fine-tuned them to process large volumes of input records.
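
The pre/post-session automation described above (shell scripts that modify parameter files before a run) could look roughly like the sketch below; the parameter names and file paths are hypothetical, not taken from the actual project:

```shell
#!/bin/sh
# Hypothetical pre-session script (parameter names and paths are
# illustrative): regenerate the Informatica parameter file so the
# session picks up the current business date and source file.

write_param_file() {
    param_file=$1
    run_date=$2
    # \$ keeps the literal $$ prefix Informatica uses for parameters.
    cat > "$param_file" <<EOF
[Global]
\$\$RUN_DATE=$run_date
\$\$SRC_FILE=/data/incoming/sales_$run_date.csv
EOF
}

write_param_file session.par "$(date +%Y%m%d)"
```

The workflow's pre-session command would invoke such a script before the session starts, so each run reads the correct daily file without manual edits.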

Environment: Informatica Power Center, Teradata, Oracle, SQL, PL/SQL, SQL*Loader, Toad, UNIX Shell Scripting, Windows XP, Business Objects.

Windstream, Prodapt July 2010 to Oct 2012

ETL Developer

Professional Experience:

Used Informatica Workflow Manager to create, schedule, execute and monitor sessions, worklets and workflows.

Debugged the failed mappings and fixed them.

Involved in Creation of Dimensions using Star and Snowflake Schema in SQL Server Analysis Services (SSAS).

Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.

Defects were tracked, reviewed, and analyzed.

Modified the mappings according to the new changes and implemented the persistent cache in several mappings for better performance.

Identified the issues in sources, targets, mappings and sessions and tuned them to improve performance.

Environment: Oracle, Informatica Power Center 8.x, UNIX, Control-M, Toad, SQL, PL/SQL, flat files, UNIX Shell Scripting, Business Objects, Windows servers.

CenturyLink, Patni (Abyss Horizon)

ETL Developer Jan 2010 to May 2010

Responsibilities:

Prepared the S2T documents.

Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.

Defects were tracked.

Modified the mappings according to the new changes and implemented the persistent cache in several mappings for better performance.

Involved in writing stored procedures and shell scripts for automating the execution of jobs in pre and post sessions to modify parameter files, prepare data sources.

Identified the issues in sources, targets, mappings and sessions and tuned them to improve performance.

Environment: Informatica Power Center, Oracle 9i, SQL, PL/SQL, Solaris, MS Visio, MS Access


