Power Center SQL Server

Location:
Cary, NC
Salary:
$130k
Posted:
September 20, 2023

Swathi Allishetty

adztw9@r.postjobfree.com

Contact: 510-***-****

SUMMARY

Over 8 years of IT experience in production support, analysis, design and development of software applications, with expertise in data warehousing solutions using Informatica PowerCenter, Aptitude and PowerMart, and in generating OLAP reports using Business Objects.

Experienced in providing Tier 2 and Tier 3 production support for Autosys batch processes and troubleshooting using PowerCenter logs, Linux/UNIX scripts and database processes.

Over 8 years of experience in ETL development using Informatica PowerMart and PowerCenter 8.5.1/9.1/8.x/7.x/6.x.

Over 8 years of experience with RDBMS systems including Oracle 9i/10g/11g, SQL Server and DB2 in client/server environments.

Strong business understanding of verticals such as Banking, Finance, Health Insurance, Networking and Telecommunications.

Extensive experience with Oracle databases, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.

Expertise in maintaining data quality, data organization, metadata and data profiling.

Experience in Data Analysis, User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.

Extensive experience with development, testing, debugging, implementation, documentation and production support.

Created UNIX shell scripts to run Informatica workflows and control the ETL flow.

Excellent knowledge and experience in data warehouse development life cycle, dimensional modeling, repository management, implementation of STAR, Snowflake schemas and slowly changing dimensions.

Experience in performance tuning of sources, transformations, targets, mappings, worklets, workflows, sessions and batches.

Proficient in using Informatica Workflow Manager, Workflow Monitor, Server Manager and pmcmd (the Informatica command line utility) to create, schedule and control workflows, tasks and sessions.

Experience in using the Informatica command line utility pmcmd to schedule and control sessions and batches.
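
As an illustration of the shell-driven workflow control described in the points above, the following is a minimal ksh-style sketch of a pmcmd wrapper; the domain, Integration Service, user, folder and workflow names are placeholders, not actual project objects.

#!/bin/ksh
# Minimal sketch: start a PowerCenter workflow via pmcmd and check the result.
# All names below are assumptions for illustration.

INFA_DOMAIN="Domain_Example"        # Informatica domain (assumed name)
INFA_IS="IS_Example"                # Integration Service (assumed name)
INFA_USER="etl_user"                # repository user (assumed)
INFA_PWD="${INFA_PWD:?set password in the environment}"
FOLDER="EDW_FOLDER"                 # repository folder (assumed)
WORKFLOW="wf_load_customer_dim"     # workflow to run (assumed)

# -wait blocks until the workflow finishes so the exit code reflects success/failure.
pmcmd startworkflow -sv "$INFA_IS" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with pmcmd return code $rc" >&2
    exit "$rc"
fi
echo "Workflow $WORKFLOW completed successfully."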

Experience in many flavors of UNIX, including AIX, HP/UX and Sun Solaris.

Extensively worked with Informatica performance tuning involving source level, target level and map level bottlenecks.

Strong hands-on experience in extracting data from various source systems, ranging from mainframe sources such as DB2, flat files and VSAM files to RDBMS such as Oracle and SQL Server.

Proficient in using Informatica Server Manager to create and schedule sessions and batches.

Experience in PL/SQL Programming (Stored Procedures, Functions and Triggers).

Experience in UNIX shell scripting and automation of ETL processes using crontab.
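
A brief sketch of the kind of crontab scheduling referred to above; the script paths, log locations and run times are assumptions for illustration only.

# Illustrative crontab entries (crontab -e); paths and times are assumed.
# Run the nightly workflow wrapper at 01:30 every day and capture its output.
30 1 * * * /home/etl/scripts/run_wf_load_customer_dim.ksh >> /home/etl/logs/wf_load_customer_dim.log 2>&1
# Purge ETL logs older than 14 days every Sunday at 03:00.
0 3 * * 0 find /home/etl/logs -name '*.log' -mtime +14 -exec rm -f {} \;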

Expertise in using Erwin, Designer 2000, Toad, SQL*Loader and PVCS.

Well experienced in designing and developing master/detail, cross-tab, drill-down, chart, ad-hoc and complex reports using BI tools such as Business Objects.

Excellent communication, client interaction and problem solving skills.

Expert in dimensional data modeling: Star-join and Snowflake schema modeling, fact and dimension tables, and logical and physical data modeling using Erwin and Oracle Designer.

Proficient in generating reports using Business Objects XI R2 functionalities such as Queries, Master/Detail and Formula, Slice and Dice, Drilling, Cross Tab and Charts.

Solid expertise in Oracle stored procedures, triggers, indexes and table partitions, and experienced in loading data from flat files, XML files, Oracle, DB2 and SQL Server into data warehouses/data marts using Informatica.

Expertise in RDBMS, database Normalization and Denormalization concepts and principles.

Strong experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Oracle.

Created tables and views based on the layout sent by Clients.

Created mapping documents, workflows and data dictionaries.

Good knowledge of Data Warehouse concepts and principles – Star Schema, Snowflake, SCD, Surrogate Keys, Normalization/ De-normalization.

Very well conversant with Oracle 9i/10g/11g and SQL Server 2005/2008, Analytical Functions, Database Design, Query Optimization and Performance tuning.

Experience with Import/Export, Data Pump, SQL*Loader and built-in packages in Oracle.

Worked extensively on development of large projects with complete end-to-end participation in all areas of the Software Development Life Cycle, and maintained documentation.

Quick adaptability to new technologies and zeal to improve technical skills.

Good analytical, programming, problem solving and troubleshooting skills.

Working closely with other members of the team to address technical problems.

Interpreting new and changing business requirements to determine their impact, and proposing enhancements and changes to meet these new requirements.

Developing complex SQL scripts to transform source data to fit a dimensional model, then creating views and materialized views in Oracle.
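
A minimal sketch of the kind of SQL described above, wrapped in a shell here-document so it can run from the same UNIX batch scripts; the connection variables and all table, view and column names are invented for illustration.

#!/bin/ksh
# Sketch only: create a dimensional view and a materialized view in Oracle.
# ORA_USER/ORA_PWD/ORA_TNS and all object names are placeholders.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE

-- Conform staged customer data to the dimension layout.
CREATE OR REPLACE VIEW v_dim_customer AS
SELECT c.customer_id             AS customer_key,
       UPPER(TRIM(c.cust_name))  AS customer_name,
       NVL(c.segment, 'UNKNOWN') AS customer_segment,
       c.effective_dt            AS effective_date
FROM   stg_customer c;

-- Pre-aggregate monthly sales for reporting; refreshed on demand by the batch.
CREATE MATERIALIZED VIEW mv_monthly_sales
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND AS
SELECT TRUNC(f.txn_date, 'MM') AS month_start,
       f.customer_key,
       SUM(f.txn_amount)       AS total_amount,
       COUNT(*)                AS txn_count
FROM   fact_transactions f
GROUP  BY TRUNC(f.txn_date, 'MM'), f.customer_key;
EOF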

TECHNICAL SKILL SET

ETL Tools

Informatica Power Center 10.5.3/9.1/8.6/8.5/8.1, Aptitude

(Workflow Manager, Workflow Monitor, Repository manager and Informatica Server) and Informatica PowerExchange 8.6.1

Databases

Oracle 19c/12c/11g/9i, IBM DB2, MS SQL Server 2000, Sybase.

Database Utilities

SQL*Plus, Oracle DBA Studio, DB2 Visualizer, SQL*Loader, ODBC, SSIS, Toad, Erwin, PVCS, Calwin, Designer 2000, Discoverer 2000 and MS Visio.

Languages & Packages

SQL, PL/SQL, Shell Scripting (Csh, Ksh, Bourne), COBOL

Operating Systems

Windows, Linux, HP-UX, Sun Solaris

Testing Tools

Mercury Test Director

Hardware

IBM 3090, IBM ES-9000, z/OS 1.8, IBM compatibles

Scheduling Tools

Control-M, Autosys

Other Tools

uDeploy, GitHub, GitLab, TortoiseSVN, Jenkins, ServiceNow, Jira, HP ALM

PROFESSIONAL EXPERIENCE

Cornerstone, Apex Systems/Wells Fargo, Raleigh, NC Jan 2019 – present

ETL Informatica Developer/Production Support

Description:

At Wells Fargo, we have one goal: to satisfy our customers' financial needs and help them achieve their dreams. Cornerstone is a wholesale banking application designed to perform Customer Due Diligence following Wells Fargo standards and complying with the Anti-Money Laundering system. The application serves wholesale customers who want to register for or apply for loans from Wells Fargo. Cornerstone performs the background check through the due diligence process; after successful completion of that process, customers can take loans from the bank.

Provided production support for various manual processes and deployments.

Provide support for Informatica workflows/mappings developed in PowerCenter 10.5.1 that run in the production environment at the client location.

Provide production ETL support, enhance existing systems, and design and develop new systems.

Involved in data analysis and handling ad-hoc requests by interacting with business analysts, clients and customers, and resolving issues as part of production support.

Supported the production system until it was fully stabilized.

Knowledge in Full Life Cycle development of Data Warehousing.

Analyze business requirements and designs, and write technical specifications to design/redesign solutions.

Involved in complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.

Creating and maintaining source-target mapping documents for ETL development team.

Providing requirement specifications and guiding the ETL team in the development of ETL jobs using the Informatica ETL tool.

Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
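
A hedged sketch of the kind of summary-load procedure mentioned above, compiled through SQL*Plus from a batch script; the table names, columns and the month parameter are illustrative and not the actual warehouse objects.

#!/bin/ksh
# Sketch only: compile a monthly summary-load procedure; all object names are assumed.
sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<'EOF'
CREATE OR REPLACE PROCEDURE load_monthly_summary (p_month IN DATE) AS
BEGIN
  -- Upsert one month of aggregated transactions into the summary table.
  MERGE INTO dw_monthly_summary s
  USING (SELECT customer_key,
                TRUNC(txn_date, 'MM') AS month_start,
                SUM(txn_amount)       AS total_amount
         FROM   dw_transactions
         WHERE  txn_date >= TRUNC(p_month, 'MM')
         AND    txn_date <  ADD_MONTHS(TRUNC(p_month, 'MM'), 1)
         GROUP  BY customer_key, TRUNC(txn_date, 'MM')) t
  ON (s.customer_key = t.customer_key AND s.month_start = t.month_start)
  WHEN MATCHED THEN UPDATE SET s.total_amount = t.total_amount
  WHEN NOT MATCHED THEN INSERT (customer_key, month_start, total_amount)
                        VALUES (t.customer_key, t.month_start, t.total_amount);
  COMMIT;
END load_monthly_summary;
/
EOF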

Worked on optimizing and tuning Teradata views and SQL queries to improve batch performance and data response time for users.

Worked closely with analysts to come up with detailed solution approach design documents.

Development of test cases and testing.

Environment: Informatica Power Center 9.x, IICS, Oracle 19c, PuTTY, UNIX-AIX, HP ALM, CA Automation, SQL Developer, FileZilla, ServiceNow, Jira, uDeploy, Jenkins, GitHub, Microsoft Office, etc.

Fidelity Investments, Durham, NC August 2018 – November 2018

ETL Informatica Developer

Description:

Fidelity Investments operates a brokerage firm, manages a large family of mutual funds, provides fund distribution and investment advice, retirement services, wealth management, securities execution and clearance, and life insurance. Fidelity does this by focusing on a diverse set of customers: from 23 million people investing their life savings, to 20,000 businesses managing their employee benefits, to 10,000 advisors needing innovative technology to invest their clients' money. We offer investment management, retirement planning, portfolio guidance, brokerage, and many other financial products.

Involved in all phases of SDLC from requirement gathering, design, development, testing, Production and support for production environment.

Create new mapping designs using various tools in Informatica Designer like Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.

Develop mappings using the needed transformations in the Informatica tool according to technical specifications.

Created complex mappings that involved implementation of business logic to load data into the staging area.

Used Informatica reusability at various levels of development.

Developed mappings/sessions using Informatica Power Center 9.6 for data loading.

Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.

Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.

Built reports according to user requirements.

Extracted data from Oracle and SQL Server then used Teradata for data warehousing.

Performed performance tuning at the source, target, mapping and session levels.

Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.

Environment: Informatica Power Center 9.6, Oracle 11g, Linux, HP-UX, Jira, Control-M Enterprise, SQL Developer, FileZilla, PVCS, Microsoft Office, etc.

Verizon Wireless, Folsom, CA September 2017 – June 2018

ETL Informatica Developer

Description:

Verizon Communications Inc. provides communications, information and entertainment products and services to consumers, businesses and governmental agencies. Its segments include Wireless and Wireline. The Wireless segment offers communications products and services, including wireless voice and data services and equipment sales, to consumer, business and government customers across the United States. The Wireline segment offers voice, data and video communications products and services, such as broadband video and data, corporate networking solutions, data center and cloud services, security and managed network services, and local and long distance voice services.

Responsibilities:

Involved in design, development and maintenance of database for Data warehouse project.

Involved in Business Users Meetings to understand their requirements.

Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 9.6.

Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.

Worked extensively with the connected lookup Transformations using dynamic cache.

Worked with complex mappings having an average of 10 transformations.

Created and scheduled sessions and jobs to run on demand, on schedule, or only once.

Monitored Workflows and Sessions using Workflow Monitor.

Performed unit testing, integration testing and system testing of Informatica mappings.

Coded PL/SQL scripts.

Coded UNIX scripts to extract data from different relational systems into flat files for use as source files in the ETL process.
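
A minimal sketch of the flat-file extraction scripts described above; the query, landing path and delimiter are assumptions.

#!/bin/ksh
# Sketch only: spool relational data to a pipe-delimited flat file for the ETL source area.
OUT_FILE=/data/etl/source/customers_$(date +%Y%m%d).dat   # assumed landing path

sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_TNS" <<EOF
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 2000
SPOOL $OUT_FILE
SELECT customer_id || '|' || cust_name || '|' || TO_CHAR(created_dt, 'YYYY-MM-DD')
FROM   customers;
SPOOL OFF
EXIT
EOF

# Fail the batch step if the extract is empty.
[ -s "$OUT_FILE" ] || { echo "Empty extract: $OUT_FILE" >&2; exit 1; }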

As a data warehousing team member, involved in designing, developing and documenting the ETL (Extract, Transform and Load) strategy to populate banking data from various source system feeds using Informatica, PL/SQL and UNIX shell scripts.

Involved in data extraction, staging, transformation and loading, and worked with Oracle Application Integration.

Creating Informatica mappings for populating the data into dimension tables and fact tables from ODS tables.

Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformations.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data, using transformations such as Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected Lookup, Filter, Sequence Generator, External Procedure, Router and Update Strategy.

Created sessions and batches. Used parameter files to pass variable values.
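
A hedged sketch of how a parameter file can pass variable values into a workflow run via pmcmd; the folder, workflow, session and parameter names are illustrative, and the connection variables are assumed to be exported by the calling environment.

#!/bin/ksh
# Sketch only: build a parameter file for one run and hand it to pmcmd.
# INFA_IS, INFA_DOMAIN, INFA_USER and INFA_PWD are assumed to be set already.
PARAM_FILE=/tmp/wf_load_sales_$(date +%Y%m%d).par   # assumed location

cat > "$PARAM_FILE" <<EOF
[EDW_FOLDER.WF:wf_load_sales.ST:s_m_load_sales]
\$\$LOAD_DATE=$(date +%m/%d/%Y)
\$InputFile1=/data/etl/source/sales_$(date +%Y%m%d).dat
EOF

pmcmd startworkflow -sv "$INFA_IS" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f EDW_FOLDER -paramfile "$PARAM_FILE" -wait wf_load_sales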

Generated Reports using Business Objects.

Created Dimension objects keeping in view of the end-user requirement. Created User Prompts, Conditions and Filters to improve the report generation.

Used Hierarchies for providing drill down options for the end-user.

Created Master Detail and Cross Tab Reports.

Environment:

Informatica Power Center 9.6, Microgen Aptitude, Oracle 12c, Linux, UNIX, TOAD, SQL, PL/SQL, SQL Server 2005, DB2, IBM AIX, WinSCP and Microsoft Office (MS Word, MS Excel, MS Visio), etc.

Brocade Communication Systems, San Jose, CA Aug 2016 – July 2017

ETL Informatica Developer

Description:

Brocade is a developer and manufacturer of networking products and solutions. Brocade serves a large range of customers and industries in more than 160 countries, including 96% of the Global 2000; revenues for 2012 were $2.3B. Brocade offers data center networking solutions that enable organizations to manage information assets. With solutions such as Storage Area Networks (SANs) and File Area Networks (FANs), Brocade addresses information management challenges such as data backup, storage and server consolidation, and business continuance.

Responsibilities:

Involved in design, development and maintenance of database for Data warehouse project.

Involved in Business Users Meetings to understand their requirements.

Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 9.5.

Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.

Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.

Worked extensively with the connected lookup Transformations using dynamic cache.

Created and scheduled sessions and jobs to run on demand, on schedule, or only once.

Monitored Workflows and Sessions using Workflow Monitor.

Performed unit testing, integration testing and system testing of Informatica mappings.

As a data warehousing team member, involved in designing, developing and documenting the ETL (Extract, Transform and Load) strategy to populate banking data from various source system feeds using Informatica, PL/SQL and UNIX shell scripts.

Involved in data extraction, staging, transformation and loading, and worked with Oracle Application Integration.

Created sessions and batches. Used parameter files to pass variable values.

Environment: Informatica Power Center 9.5, SQL, Oracle 12c, HP ALM, MySQL, SQL Developer, WinSCP, HP-UX, UNIX, Windows XP, PVCS, Microsoft Office, etc.

T. Rowe Price, Owings Mills, MD Oct 2015 – July 2016

ETL Informatica Developer

Description:

T. Rowe Price is an American publicly owned investment firm, headquartered in Baltimore, Maryland. The company offers mutual funds, subadvisory services, and separate account management for individuals, institutions, retirement plans, and financial intermediaries. Additionally, the organization offers investment planning and guidance tools.

Responsibilities:

Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.

Analyze business requirements and designs, and write technical specifications to design/redesign solutions.

Involved in complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.

Developed technical design documents (HLD and LLD) based on the functional requirements.

Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter.

Creating and maintaining source-target mapping documents for ETL development team.

Providing requirement specifications and guiding the ETL team in the development of ETL jobs using the Informatica ETL tool.

Develop Mappings and Workflows to generate staging files.

Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.

Created multiple Mapplets, Workflows, Tasks and database connections using Workflow Manager.

Recovering the failed Sessions and Batches.

Extracted the data from DB2, CSV and Flat files.

Implemented performance tuning techniques by identifying and resolving bottlenecks in sources, targets, transformations, mappings and sessions to improve performance. Understood the functional requirements.

Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.

Worked on optimizing and tuning Teradata views and SQL queries to improve batch performance and data response time for users.

Worked closely with analysts to come up with detailed solution approach design documents.

Development of test cases and testing.

Environment: Informatica Power Center 9.6, Informatica Power Exchange, DB2, HP-UX, Windows XP, DB2 Visualizer, FileZilla, PVCS, Microsoft Office, etc.

HP Enterprise Solutions, Folsom, CA April 2012 – Sep 2015

ETL Informatica Developer

Description:

The Hewlett-Packard Company (HP) is an American multinational information technology company that provides products, technologies, software, solutions and services to consumers, small- and medium-sized businesses (SMBs) and large enterprises, including customers in the government, health and education sectors.

Responsibilities:

Performed data analysis and gathered columns metadata of source systems for understanding requirement feasibility analysis.

Involved in complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.

Developed technical design documents (HLD and LLD) based on the functional requirements.

Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter.

Creating and maintaining source-target mapping documents for ETL development team.

Providing requirement specifications and guiding the ETL team in the development of ETL jobs using the Informatica ETL tool.

Develop Mappings and Workflows to generate staging files.

Developed various transformations like Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into target table.

Created multiple Mapplets, Workflows, Tasks, database connections using Workflow Manager.

Created sessions and batches to move data at specific intervals & on demand using Server Manager.

Responsibilities included creating and scheduling sessions.

Recovering the failed Sessions and Batches.

Extracted the data from Oracle, DB2, CSV and Flat files.

Extracting the data using SSIS.

Implemented performance tuning techniques by identifying and resolving the bottlenecks in source, target, transformations, mappings and sessions to improve performance.

Worked on optimizing and tuning Oracle views and SQL queries to improve batch performance and data response time for users.

Provided support during the system test, Product Integration Testing and UAT.

Provided quick production fixes and proactively involved in fixing production support issues.

Development of test cases and testing.

Coordinate with Configuration management team in code deployments.

Environment: Informatica Power Center 9.1, Unix, Oracle 11g, SQL Server 2005, COBOL, JCL, Flat Files, TOAD, Autosys, SQL Assistant 12.0, Business Objects, Korn Shell, MS Visio, Remedy, FileZilla.

EDUCATION

Bachelor of Science in Computers – Osmania University 2011


