
SQL Server Data

Location: Newtown, PA, 18940
Posted: January 17, 2017


SREENIVAS M

732-***-****

******.****@*****.***

SUMMARY

• 10 years of strong experience in the complete software development life cycle (SDLC), including business requirements gathering, system analysis, design, development, and implementation of data warehouses; 6+ years of experience in ETL, 2+ years of PL/SQL development, and 4+ years on the Pentaho Data Integration (PDI) platform.

• Vast experience in data warehousing using Informatica PowerCenter 9.5.1/9.1/8.6/8.5/8.1/7.1/6.1 (Repository Manager, Mapping Designer, Workflow Manager, Mapplet Designer, Transformation Designer, Warehouse Designer) with Star and Snowflake schemas, on Oracle and SQL Server databases.

• Strong experience in installing Pentaho Data Integration 6.0.1 and in installing and configuring Informatica PowerCenter 9.6.1.

• Working knowledge of Informatica Power Exchange to support Data Integration requirements.

• Experience in using Oracle 10g/9i/8i, Netezza, Teradata, MS SQL Server 2005, MS Access 2007, SQL, and PL/SQL.

• Strong experience in SQL Server 2008 R2/2008/2005/2000 with the Business Intelligence stack: SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).

• Excellent hands-on experience in creating measure groups, scorecards, and Key Performance Indicators (KPIs), and in defining actions, translations, and perspectives in SSAS.

• Performed credit risk, operational risk, and market risk analysis using data reporting, data quality, and data governance.

• Worked with dimensional data warehouses in Star and Snowflake schemas; implemented slowly changing dimensions, slowly growing target mappings, and Type 1/2/3 dimension mappings (see the Type 2 sketch at the end of this summary).

• Proficient in transforming data from various sources (flat files, XML, Oracle) to Data warehouse using ETL tools.

• Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.

• Extensive experience in developing Informatica mappings and mapplets using various transformations for extracting, transforming, and loading data from multiple sources to a data warehouse, and in creating workflows with worklets and tasks and scheduling those workflows.

• Extensively involved in resolving Informatica and database performance issues; expertise in error handling, debugging, and problem fixing in Informatica.

• Experience in logical and physical data modeling using ER/Studio. Expertise in using SQL*Loader to load data from external files into Oracle databases (a sample control file appears at the end of this summary).

• Extensive experience in writing PL/SQL program units such as triggers, procedures, functions, and packages in UNIX and Windows environments (a trigger sketch follows this summary).

• Exposure to onshore and offshore support activities.

• Excellent verbal and written communication skills, a clear understanding of business procedures, and the ability to work independently or as part of a team.
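
The Type 2 slowly changing dimension approach mentioned above can be sketched in two SQL statements: expire the current rows whose attributes changed, then insert fresh versions. This is a minimal, generic illustration (Oracle/PostgreSQL-style date literals), not code from any project listed here; all table and column names (dim_customer, stg_customer, current_flag, and so on) are hypothetical.

```sql
-- Hypothetical Type 2 SCD load: expire changed rows, then add new versions.

-- Step 1: close out current dimension rows whose attributes changed in staging.
UPDATE dim_customer
SET    effective_end_date = CURRENT_DATE,
       current_flag       = 'N'
WHERE  current_flag = 'Y'
  AND EXISTS (SELECT 1
              FROM   stg_customer s
              WHERE  s.customer_id = dim_customer.customer_id
                AND  (s.customer_name <> dim_customer.customer_name
                      OR s.customer_tier <> dim_customer.customer_tier));

-- Step 2: insert a fresh "current" version for new and changed keys.
INSERT INTO dim_customer
       (customer_id, customer_name, customer_tier,
        effective_start_date, effective_end_date, current_flag)
SELECT  s.customer_id, s.customer_name, s.customer_tier,
        CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM    stg_customer s
WHERE   NOT EXISTS (SELECT 1
                    FROM   dim_customer d
                    WHERE  d.customer_id = s.customer_id
                      AND  d.current_flag = 'Y');
```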
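
For the SQL*Loader work referenced above, a control file of roughly this shape drives a flat-file load. This is a generic sketch under assumed names; the data file, table, and columns are hypothetical.

```sql
-- Illustrative SQL*Loader control file (hypothetical names); invoked e.g. as:
--   sqlldr userid=etl_user control=stg_customer.ctl log=stg_customer.log
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  created_date DATE "YYYY-MM-DD"
)
```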
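
As one small example of the PL/SQL program units listed above, a minimal audit trigger might look like the following; the table and columns are hypothetical, and real triggers from these projects would carry project-specific rules.

```sql
-- Minimal PL/SQL audit trigger sketch (hypothetical table and columns).
CREATE OR REPLACE TRIGGER trg_stg_customer_audit
BEFORE INSERT OR UPDATE ON stg_customer
FOR EACH ROW
BEGIN
  -- Stamp who changed the row and when.
  :NEW.last_updated_by   := USER;
  :NEW.last_updated_date := SYSDATE;
END;
/
```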

TECHNICAL SKILLS

ETL Tools: Pentaho Data Integration 6.0.1, Pentaho Report Designer, Tableau, Informatica PowerCenter 9.6.1/9.1/8.6/8.5/8.1/7.1/6.1, Informatica Data Analyzer 8.5/8.1, Informatica Power Exchange, IDQ, data cleansing, OLAP, ROLAP, MOLAP, SSIS, Autosys, Control-M, Star Schema, Snowflake Schema, OLTP, SQL*Plus, SQL*Loader

Data Modeling: Erwin 7.2/7.0, TOAD, Oracle Designer, PL/SQL Developer 5.1.4

Databases: PostgreSQL, Netezza, Teradata, Oracle 11g/10g/9i/8i, MS SQL Server 7.0, SQL Server 2005/2000, MS Access

Programming: C, C++, SQL, PL/SQL, T-SQL, SQL*Plus, HTML, UNIX shell scripting (AIX), Java, VB.NET, Python, JSON

Reporting Tools: Pentaho Report Designer (PRD), Business Objects XI 3.1/XI R2, Crystal Reports 2008/XI, Cognos 8.4, SSRS, SSAS

Environment: UNIX, Windows 2000/2003/XP/Vista

PROFESSIONAL EXPERIENCE

Client : Equian

Location : Denver, CO

Duration : April 2016 - Present

Role : Senior ETL Pentaho Developer

Responsibilities:

• Designed and implemented Change Data Capture (CDC) processes for fact and dimension tables through a combination of timestamps, staging (before/after images), and bridge tables (a timestamp-based sketch follows this list).

• Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, then tuned them for better performance.

• Used various types of inputs and outputs in Pentaho Kettle including Database Tables, MS Access, Text Files, Excel files and CSV files.

• Designed and implemented a Business Intelligence platform from scratch and integrated it with upstream systems using Hadoop, Pig, Hive, and other Big Data components; made the platform more resilient and reduced the configuration required, so that clients can be onboarded with minimal setup.

• Automated data transfer processes and mail notifications using FTP and send-mail tasks in transformations.

• Responsible for creating database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to structure stored data and maintain the database efficiently (see the stored procedure sketch after this list).

• Reported Data Designer bugs in Adaptive Integration to support and created workarounds until the bugs were fixed.

• Identified, documented, and communicated BI and ETL best practices and industry-accepted development methodologies and techniques.

• Troubleshot BI tool problems and provided technical support as needed; performed other tasks as assigned.

• Worked closely with the Project Manager to understand the requirements of the reporting solutions to be built.

• Designed jobs and transformations to load data sequentially and in parallel for initial and incremental loads.

• Used various PDI steps to cleanse and load data per business needs.

• Configured the Data Integration server to run jobs in local, remote, and clustered modes.

• Designed advanced reports, analysis reports, and dashboards per client requirements.

• Coordinated and assigned work to the offshore team, monitoring daily status and addressing roadblocks.
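
The timestamp-based CDC pattern described in the first bullet of this section can be sketched as below: extract only rows newer than the last recorded watermark, then advance the watermark. The source table and the etl_batch_control bookkeeping table are hypothetical, not the project's actual schema.

```sql
-- Timestamp-driven CDC sketch: pull only rows changed since the last load.
SELECT s.*
FROM   src_orders s
WHERE  s.last_update_ts > (SELECT last_extract_ts
                           FROM   etl_batch_control
                           WHERE  table_name = 'SRC_ORDERS');

-- After the load commits, advance the watermark for the next run.
UPDATE etl_batch_control
SET    last_extract_ts = CURRENT_TIMESTAMP
WHERE  table_name = 'SRC_ORDERS';
```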
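
For the T-SQL database objects mentioned above, a stored procedure along these lines illustrates the update-then-insert (upsert) pattern; all object names are hypothetical and stand in for the real warehouse objects.

```sql
-- Minimal T-SQL upsert procedure sketch (hypothetical object names).
CREATE PROCEDURE dbo.usp_load_dim_product
AS
BEGIN
    SET NOCOUNT ON;

    -- Update rows that already exist in the dimension...
    UPDATE d
    SET    d.product_name = s.product_name
    FROM   dbo.dim_product AS d
    JOIN   dbo.stg_product AS s ON s.product_id = d.product_id;

    -- ...and insert the ones that do not.
    INSERT INTO dbo.dim_product (product_id, product_name)
    SELECT s.product_id, s.product_name
    FROM   dbo.stg_product AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dbo.dim_product AS d
                       WHERE  d.product_id = s.product_id);
END;
```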

Environment: Pentaho BI Server, Pentaho Data Integration, PRD, SSIS, PostgreSQL, Oracle, SQL Server, Shell, UNIX, SQL Profiler, XML, SSRS, Tableau, JavaScript, Hadoop

Client : PFGC

Location : Richmond, VA

Duration : Jan 2013-March 2016

Role : Senior ETL Pentaho Developer

Responsibilities:

• Set up batches, configured sessions, and scheduled loads as required using UNIX scripts.

• Extensively worked on fixing poorly designed mappings and developed schedules to automate Pentaho transformations and jobs.

• Participated in business requirements gathering meetings along with Business Analysts to understand the scope of the project.

• Involved in creating Pentaho transformations and jobs to move data from a SQL Server database to PostgreSQL.

• Involved in client meetings to determine the project timelines.

• Point of contact for all ETL applications.

• Designed an ETL framework to load data from Excel files to the staging area, and from staging to the data warehouse.

• Coded ETL scripts using the Pentaho Data Integration suite 5.2 (Kettle: Spoon, Pan, Kitchen, Carte).

• Made use of various built-in steps such as Get Files From Result, Get Rows From Result, Copy Rows To Result, Set Variables, and Get Variables when creating jobs.

• Used various input and output steps such as Excel Input/Output, Text File Input/Output, and Table Input/Output.

• Created cubes using Pentaho Schema Workbench.

• Installed and set up the complete BI suite per client requirements.

• Gathered and analyzed the requirements for designing the BI and ETL architecture.

• Interacted with clients to understand change requests and to enhance and design reports and dashboards.

• Built custom Java code to manipulate dates using quarter and year information, using the Execute SQL Script, Modified Java Script Value, Formula, and Regex Evaluation steps.

• Performed data extraction from OLTP to OLAP systems for decision making using SSIS; extracted large volumes of data from different data sources and loaded it into targets by performing various transformations in SQL Server Integration Services (SSIS).

• Used various built-in steps such as Select Values, Add Sequence, Add Constants, Row Flattener, String Cut, Unique Rows, Split Fields, Concat Fields, Add a Checksum, Replace in String, Value Mapper, and Row Normaliser.

• Used lookup steps such as Stream Lookup, Fuzzy Match, Check If a Column Exists, Call DB Procedure, and File Exists.

• Extensively worked with the Data Validator step to enforce data quality and reject bad data.

• Used utility steps such as Mail and Write To Log to log failures and send email notifications to users.

• Implemented business rules to transform data and load it from staging tables into data warehouse tables (a simplified example follows this list).

• Validated and unit-tested data in the ODS layer.

• Supported QA and various cross-functional users in all their data needs.

• Scheduled jobs with the Pentaho scheduler to run nightly, weekly, and monthly.
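
A simplified version of the staging-to-warehouse business-rule loads mentioned above might look like the following; the tables and the rules themselves are illustrative only, not the project's actual logic.

```sql
-- Illustrative business-rule load from staging into the warehouse.
INSERT INTO dw_sales (order_id, order_date, net_amount, order_status)
SELECT s.order_id,
       s.order_date,
       -- Business rule: net out any discount.
       s.gross_amount - COALESCE(s.discount_amount, 0),
       -- Business rule: derive a readable status from the cancel flag.
       CASE WHEN s.cancel_flag = 'Y' THEN 'CANCELLED' ELSE 'ACTIVE' END
FROM   stg_sales s;
```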

Environment: Pentaho Data Integration 5.2, Pentaho BI 5.0.1, Pentaho Report Designer, Pentaho Schema Workbench, Oracle 12c, Tableau 9.1, Java, UNIX, SSIS, SSRS, SSAS, SQL Server 2008, PostgreSQL, Netezza, Teradata, DB2 8.1, XML, Autosys, Oracle 11g, TOAD, SQL, PL/SQL, Active

Client : Alere Health Inc

Location : Upper Saddle River, NJ

Duration : June 2010 - Dec 2012

Role : Senior Informatica Developer

Responsibilities:

• Worked closely with the client to understand business requirements, perform data analysis, and deliver on client expectations.

• Used Informatica PowerCenter 8.6.1/8.5 and all of its features extensively in migrating data from OLTP systems to the enterprise data warehouse.

• Extensively used Erwin for Logical and Physical data modeling and designed Star Schemas.

• Performed securities pricing management and price discrepancy analysis across TradeWeb, CTM, accounting applications, and portfolio management data; worked with security valuations data, trade commissions data, and fixed income trade order calculations, and ran database and log checks to determine the reasons for discrepancies.

• Extracted data from different sources such as Oracle, flat files, XML, DB2, and SQL Server, and loaded it into the DWH.

• Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.

• Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 8.6.1/8.5.

• Developed cubes for reporting based on the business requirements using SSAS 2008.

• Identified KPIs and created different scorecards and dashboards.

• Worked on PowerExchange to create data maps, pull data from the mainframe, and transfer it into the staging area.

• Handled slowly changing dimensions (Type I, Type II, and Type III) based on business requirements.

• Automated the jobs through the Maestro scheduler, which runs every day, while maintaining data validations.

• Involved in creating folders, users, repositories, and deployment groups using Repository Manager.

• Developed PL/SQL and UNIX shell scripts for scheduling sessions in Informatica.

• Involved in scheduling Informatica workflows using Autosys.

• Migrated mappings, sessions, and workflows from development to testing and then to Production environments.

• Involved in Performance tuning for sources, targets, mappings and sessions.

• Developed the AXIOM Controller View application: design, end-to-end technology implementation, team mentoring, and delivery planning.

• Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, tuning it by identifying and eliminating bottlenecks for optimum performance.

• Wrote PL/SQL stored procedures, triggers, and cursors to implement business rules and transformations.

• Worked extensively with different caches such as index, data, and lookup caches (static, dynamic, persistent, and shared).

• Created deployment groups, migrated the code into different environments. Worked closely with reporting team to generate various reports.

Environment: Informatica PowerCenter v 8.6.1/8.5 (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SQL Server 2005, DB2 8.1, XML, Autosys, Oracle 10g, TOAD, SQL, PL/SQL, UNIX, Active

Client : The Health Net Inc

Location : Woodland Hills, CA

Duration : Mar 2008 - Jun 2010

Role : ETL/Informatica Developer

Responsibilities:

• Responsible for the definition, development, and testing of processes and programs necessary to extract data from operational databases, transform and cleanse it, and load it into the data warehouse using Informatica PowerCenter.

• Set up users, user groups, and their access profiles in Repository Manager.

• Extracted data from different sources such as DB2, SQL Server, and flat files, and loaded it into the DWH.

• Developed complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner and Stored procedure transformations.

• Developed connected and unconnected Lookup transformations to look up data from the source and target tables.

• Wrote SQL, PL/SQL, stored procedures for implementing business rules and transformations.

• Used the Update Strategy transformation to effectively migrate data from source to target.

• Developed SQL queries, stored procedures, views, triggers, T-SQL and DTS/SSIS.

• Developed and documented Mappings/Transformations, Audit procedures and Informatica sessions.

• Used Informatica PowerCenter Workflow Manager to create sessions and run the logic embedded in the mappings.

• Involved in unit, integration and system tests for Data warehouse.

Environment: Informatica PowerCenter 7.1.1/7.x (Repository Manager, Designer, Workflow Monitor, Workflow Manager), flat files, DB2, SQL Server 2005, SSIS, SQL, PL/SQL, Windows 2000.

Client : BMW Financial Services

Location : Hilliard, OH

Duration : Oct 2007-Feb 2008

Role : ETL Developer

Responsibilities:

• Extensively used Informatica PowerCenter 7.1/6.1 to extract data from various sources and load it into the staging database.

• Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.

• Extensively used ETL to load data from multiple sources into the staging area (Oracle 9i) using Informatica PowerCenter 7.1; worked with pre- and post-sessions and extracted data from the transaction system into the staging area. Knowledge of identifying fact and dimension tables.

• Tuned sources, targets, mappings and sessions to improve the performance of data load.

• Designed Logical and Physical Modeling using Erwin Data Modeler tool.

• Experience in migrating Informatica mappings, sessions, and workflows to Data Integrator.

• Involved in developing Informatica mappings and mapplets and tuned them for optimum performance, dependencies, and batch design.

• Created several Informatica mappings to populate data into dimension and fact tables.

• In charge of converting data in SQL Server 2005 using SSIS.

• Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and other database related issues.

• Designed mapping templates to specify the high-level approach.

Environment: Informatica PowerCenter 7.1/6.1, Oracle 10g/9i, PL/SQL, TOAD, SQL*Plus, Erwin, Windows, UNIX.

Client : Zensar Technologies

Location : Pune, India

Duration : Dec 2005 – August 2007

Role : PL/SQL Developer

Responsibilities:

• Interacted with end users for gathering requirements.

• Performed database tuning, monitoring, loading, and backups.

• Created prototype reporting models, specifications, diagrams, and charts to provide direction to system programmers.

• Developed procedures and functions using PL/SQL.

• Created a number of database triggers according to business rules using PL/SQL.

• Developed SQL for loading metadata from Excel spreadsheets into the database using SQL*Loader.

• Extensively used PL/SQL to implement cursors, triggers and packages.

• Developed SQL script for loading data from existing MS Access tables to Oracle.

• Created record groups for data manipulation; performed unit and integrated system testing.

• Involved in database design, development of database, tables and views.

• Involved in application testing, deployment and production support.

Environment: Oracle 8i, PL/SQL, TOAD, SQL*Loader, MS Access, Excel spreadsheets, Windows NT.


