
Sr. DW Developer / SQL Server Developer / ETL Developer

Location:
Paramus, NJ, 07652
Posted:
May 28, 2009

Resume:

Shashank Kolishetty

*******.****@*****.***

Summary:

• SDLC: 7 years of total IT experience in Business Requirements Analysis, Application Design, Data Modeling, Development, Testing, Implementation and Support for Data Warehousing and Client/Server applications in the Technology, Banking, Pharmaceutical and Insurance industries.

• Data Warehousing: 5 years of strong Data Warehousing and ETL experience using SQL Server 2005 SSIS/SQL Server 2000 DTS, Informatica Power Center 7.1.4/6.2/5.1, IMS data, OLTP, OLAP.

• Data Modeling: 3 years of Data Modeling experience including Dimensional Data Modeling, Star Join Schema/Snowflake Modeling, FACT & Dimensions tables, Physical & Logical Data Modeling, ERWIN 4.x/3.x, Oracle Designer.

• Business Intelligence: 3 years of Business Intelligence experience in SQL Server Reporting Services 2005, Business Objects 6.5, Cognos.

• Databases: 7 Years of database experience using MS SQL Server 2005/2000/7.0, Oracle 10g/9i/8i/7.x, Teradata V2R6/V2R5, Sybase 12.0/11.x, DB2 UDB 8.0/7.0, AS/400, MS Access 7.0/2000

Education: Master's in Computer Science

Technical Skills

ETL Tools: SQL Server SSIS/DTS, Informatica 7.1.4/6.2/5.1 (Power Center/Power Mart)

Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling, Erwin 3.5.2/3.x

Business Intelligence: SSRS, Business Objects XI r2/6.5.1/6.1b/5.1, Cognos, MS Access Reports

Databases: MS SQL Server 2005/2000, Teradata V2R6/V2R5, Oracle 10g/9i/8i/8.0/7.x, DB2 UDB, MS Access 2000, Sybase Server 12.0/11.x

Others: SQL, T-SQL, PL/SQL, Unix Shell Scripting, Perl, Visual Basic 6.0/5.0/4.0, HTML 4.0, DHTML, XML 1.0, C++, Java

Environment: MS DOS 6.22, Win NT/2000/2003, AIX, NCR UNIX SVR4 MP-RAS, HP-UX

Professional Experience

AT&T, NJ Jun 2007 – Current

Sr. DW Developer

AT&T is the nation's most reliable wireless network. We implemented an enterprise data warehouse (EDW) to provide a detailed view of the customer. As an ETL developer, I created various logical mappings for the data marts, based on the business requirements, carrying customer data such as bill payments, dues, and plan details.

Responsibilities

• Interacted with the Business Users to analyze the business requirements and transform the business requirements into the technical requirements.

• Prepared technical specifications for the development of SSIS (ETL) packages to load data into various target tables and defined ETL standards.

• Created logical and physical data models using Erwin.

• Created Entity Relationship (ER) diagrams based on requirements.

• Evaluated different ETL tools like SSIS & Informatica.

• Created ETL mapping documents for every mapping and a Data Migration document for smooth transfer of the project from the development environment to testing and then to production.

• Implemented SSIS with SQL Server stored procedures for data manipulations.

• Analyzed, designed, developed, implemented and maintained moderate to complex initial load and incremental load mappings to provide data for enterprise data warehouse.

• Worked on SSIS, for removing duplicate records with data standardization.

• Implemented an incremental loading process alongside the cumulative loading process.

• Identified various source systems for data feeds.

• Created mappings to load data extracted from flat files, Cobol files and XML files to staging area using SQL Server SSIS.

• Created mappings to extract data from Staging area and load into enterprise data warehouse with applying various business rules.

• Created mappings using row transformations, rowset transformations, business intelligence transformations, split and join transformations, and other transformations such as Export Column, Import Column, Audit, and SCD transformations.

• Extracted customer data from SQL Server database and loaded to warehouse.

• Implemented Type II slowly changing dimensions to maintain both current and historical information in dimension tables (see the sketch after this list).

• Implemented weekly error tracking and correction process using SSIS.

• Implemented an audit process to ensure the data warehouse matches the source systems from all reporting perspectives.

• Developed custom report selection and ordering using SQL Server Reporting Services.

• Created maestro schedules/jobs for automation of ETL load process.

• Involved in unit testing and User Acceptance Testing (UAT) to verify that data extracted from the different source systems was loaded into the targets accurately and according to user requirements.

• Prepared and used test data/cases to verify the accuracy and completeness of the ETL process.

• Actively involved in production support and transferred knowledge to other team members.
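
For illustration, here is a minimal T-SQL sketch of the Type II slowly changing dimension load referenced above; the table and column names (DimCustomer, StagingCustomer, PlanCode, BillStatus, EffectiveDate, ExpiryDate, IsCurrent) are hypothetical placeholders, not the actual project schema:

-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE d
SET d.ExpiryDate = GETDATE(), d.IsCurrent = 0
FROM dbo.DimCustomer AS d
JOIN dbo.StagingCustomer AS s ON s.CustomerID = d.CustomerID
WHERE d.IsCurrent = 1
  AND (s.PlanCode <> d.PlanCode OR s.BillStatus <> d.BillStatus);

-- Step 2: insert a new current row for new customers and for customers whose row was just expired.
INSERT INTO dbo.DimCustomer (CustomerID, PlanCode, BillStatus, EffectiveDate, ExpiryDate, IsCurrent)
SELECT s.CustomerID, s.PlanCode, s.BillStatus, GETDATE(), NULL, 1
FROM dbo.StagingCustomer AS s
LEFT JOIN dbo.DimCustomer AS d
  ON d.CustomerID = s.CustomerID AND d.IsCurrent = 1
WHERE d.CustomerID IS NULL;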

Environment: SQL Server 2000/2005, SSIS, SSRS, Informatica Power Center 7.1.4, T-SQL, Cobol, flat files, TOAD 8.6, Oracle 10g, SQL, PL/SQL, Windows NT/2000, Erwin.

Bristol-Myers Squibb, NJ May 2006 – Jun 2007

Sr. DW Developer

The project deals with pharma research data and sales & marketing data collected from different tests conducted on subjects. The data is then consolidated into a data mart that is used for report generation.

Responsibilities

• Analyzed the source data coming from different systems and worked with business users and developers to develop the model.

• Involved in dimensional modeling to design and develop the star schema, using Erwin to design fact and dimension tables.

• Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using SSIS.

• Analyzed IMS Rx Data using IMS tools such as Xponent and Xponent Plantrak, regarding segmentation & profiling.

• Designed and developed SSIS packages, stored procedures, configuration files, tables, views, and functions; implemented best practices to maintain optimal performance.

• Worked with SQL Server, Flat files and SAS data sets using SSIS to extract data into Data Warehouse.

• Developed a number of complex SSIS mappings for different types of tests in customer information and for monthly and yearly loading of data.

• Built efficient SSIS packages for processing fact and dimension tables with complex transforms and Type 1 and Type 2 changes.

• Used SSIS to perform ETL operations for Master Data Management (MDM) needs as well as for auditing purposes.

• Migrated SQL Server 2000 DTS packages to SQL Server 2005 SSIS

• Created Stored Procedures for data transformation purpose.

• Used stored procedures to create a standard Time dimension and to drop and re-create indexes before and after loading data into the targets (see the sketch after this list).

• Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.

• Captured erroneous records, corrected them, and loaded them into the target system.

• Implemented efficient and effective performance tuning procedures and performed benchmarking; these runs were used to set a baseline against which improvements were measured.

• Created reports using SQL Server Reporting Services (SSRS), Stored Procedures and Free Hand SQL as the Data Providers.

• Formatted reports per user requirements using extensive SQL Server Reporting Services (SSRS) functionality, which allows users to analyze the data.
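
As an illustration of the load-window index handling mentioned above, here is a minimal T-SQL sketch; the object names (usp_LoadFactSales, FactSales, StagingSales, IX_FactSales_DateKey) are hypothetical placeholders rather than the actual project objects:

CREATE PROCEDURE dbo.usp_LoadFactSales
AS
BEGIN
    -- Drop the nonclustered index so the bulk insert is not slowed by index maintenance.
    IF EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = 'IX_FactSales_DateKey'
                 AND object_id = OBJECT_ID('dbo.FactSales'))
        DROP INDEX IX_FactSales_DateKey ON dbo.FactSales;

    -- Load the staged rows into the fact table.
    INSERT INTO dbo.FactSales (DateKey, ProductKey, SalesAmount)
    SELECT DateKey, ProductKey, SalesAmount
    FROM dbo.StagingSales;

    -- Re-create the index once the load completes.
    CREATE NONCLUSTERED INDEX IX_FactSales_DateKey ON dbo.FactSales (DateKey);
END;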

Environment: MS SQL Server 2000/2005, SSIS, SSRS, DTS, IMS data, T-SQL, Oracle 10g, Erwin 4x, Windows NT/2000, flat files.

VISA International, CA Nov 2005 – Apr 2006

SQL Server Developer

In the credit card division of Visa, we built a data warehouse to analyze transactions across finance, marketing, risk, collections, and consumer relations. With clean customer data successfully implemented, the data warehouse is growing in analytical richness.

Responsibilities

• Involved in Business Analysis and requirement gathering.

• Responsible for verification of functional specifications

• Used DTS jobs for importing data from legacy systems to the current model.

• Developed all the required stored procedures, user-defined functions, and triggers using T-SQL.

• Used SQL Server 2000/2005 reporting services for generating reports.

• Used SQL Server Profiler for auditing and analyzing events that occurred during a particular time window.

• Involved in performing Extraction, Transformation and Loading using DTS.

• Involved in data integration by identifying information needs within and across functional areas; involved in an enterprise database upgrade and scripting/data migration with the SQL Server export utility.

• Used “SQL Profiler TSQL_Duration” template for tracking execution time of TSQL Statements.

• Development and implementation of Backup and Recovery strategies.

• Optimized query performance by modifying T-SQL queries, removing unnecessary columns, eliminating redundant and inconsistent data, normalizing tables, establishing joins, and creating indexes wherever necessary.

• Planned complete database backups and restored the database from disaster recovery (see the sketch after this list).

• Performed day-to-day activities such as backing up to disk and restoring databases and transaction logs on production/development servers per business/IT needs.
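
For reference, a minimal T-SQL sketch of the backup and restore routine described above; the database name and file paths (SalesDW, D:\Backups\...) are hypothetical placeholders:

-- Full backup and periodic log backups to disk.
BACKUP DATABASE SalesDW
TO DISK = 'D:\Backups\SalesDW_full.bak'
WITH INIT, NAME = 'SalesDW full backup';

BACKUP LOG SalesDW
TO DISK = 'D:\Backups\SalesDW_log.trn'
WITH NAME = 'SalesDW log backup';

-- Restore sequence used in a disaster-recovery drill.
RESTORE DATABASE SalesDW
FROM DISK = 'D:\Backups\SalesDW_full.bak'
WITH NORECOVERY, REPLACE;

RESTORE LOG SalesDW
FROM DISK = 'D:\Backups\SalesDW_log.trn'
WITH RECOVERY;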

Environment: MS SQL Server 2000/2005, SSIS, SSRS, DTS, T-SQL, Oracle 10g, Erwin 4x, flat files.

Bank of America, Charlotte, NC Aug 2004 – Nov 2005

ETL Developer

Integrated Standard Banking System (ISBS) is a large banking application that deals with several areas of retail banking. It has several modules, including Savings, Current, Deposits, Loans, and ATM Transactions.

Responsibilities

• Responsible for analysis, design and development of Enterprise Data Warehouse.

• Responsible for the Design of the Target Data Warehouse Entity for Oracle using Power Center.

• Developed SQL Server DTS packages to import data from flat files to staging database

• Developed complex Informatica mappings & tuned them for better performance.

• Performed Source partitioning to improve session performance.

• Extensively used connected and unconnected lookup transformations.

• Designed and developed Target Warehouse Entity for Oracle using Power Center

• Made extensive use of SQL overrides, substituting them in place of multiple transformations.

• Used ODBC to connect to the target Oracle database.

• Responsible for the design and development of Oracle PL/SQL queries.

• Extensively fine-tuned Oracle stored procedures to improve performance.

• Created Oracle stored procedures, functions and triggers using PL/SQL.

• Optimized query performance, session performance, and reliability.

• Wrote complex stored procedures and triggers and optimized them for maximum performance.

• Worked on Data Conversion and Data Analysis to meet EDW requirements.

• Scheduled and monitored automated weekly jobs under UNIX environment.

• Used parameters and variables to facilitate smooth transition between the development and production environments.

• Performed unit testing of individual modules and their integration testing.

• Debugged and sorted out the errors and problems encountered in the production environment.

Environment: Informatica Power Center 6.2, SQL Server 2000, DTS, Oracle 9i, SQL, PL/SQL, SQL*Loader, IMS Rx Data, UNIX Shell Scripting, Erwin 3.5.2, Sun Solaris 2.6.

Mercury Insurance, FL Oct 2003 – Aug 2004

ETL Developer

The project was aimed at building a Claims Data Mart (CDM) as part of the Insurance Data Warehouse (IDW). IDW provides policy transaction, premium, and claims transaction data to the business users. The Claims Data Mart was built primarily for claims transactions and for analysis and report generation, and was used by end users for decision-making on claims processing and approval. The source data contained information about customers, policies, premiums, etc., required for claims processing.

Responsibilities

• Extensively worked on Informatica Power Center 6.x tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer and Workflow Manager.

• Developed various mappings using the Designer that involved different transformations such as Update Strategy, Lookup, Aggregator, Stored Procedure, and Expression transformations.

• Created reusable transformations and mapplets based on the business rules to ease the development process and was responsible for documenting the changes.

• Implemented slowly changing dimension methodology for accessing the full history of accounts.

• Wrote scripts to archive the files, to maintain consistent naming for files coming from different machines, and to update the parameter file.

• Used Debugger to test the data flow and fix the mappings.

• Different Performance Tuning techniques were implemented to optimize data load.

• Performed data quality analysis to determine cleansing requirements.

• Developed test cases for system integration testing

• Updated change request documents for review and approval, in response to modification requests.

Environment: Informatica Power Center 6.2, Oracle 8i, SQL Server 2000, Business Objects 6.5, DB2, Flat Files, XML, Unix, Windows 2000

Lucent, Bangalore, India Feb 2003 – Sept 2003

Data Warehouse Developer

Lucent developed a central data warehouse for sales. The data warehouse significantly enhanced Lucent's flexibility, proprietary competitive advantage, and service levels in serving the critical needs of the sales team. The project involved developing an enterprise subject data warehouse: a centralized database that collects, organizes, and stores data from different operational data sources to provide a single source of integrated and historical data.

Responsibilities

• Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica power center.

• Responsible for developing Source to Target Mappings.

• Extensively used Informatica Client tools- Source Analyzer, Warehouse Designer, Mapping Designer.

• Developed Informatica mappings for data extraction and loading; worked with Expression, Lookup, Filter, Sequence Generator, and Aggregator transformations to load the data from source to target.

• Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.

• Developed mappings and sessions for relational and flat-file sources and targets.

• Imported data from various sources (Oracle, flat files, XML), then transformed and loaded it into targets using Informatica.

• Created and scheduled sessions and jobs to run on demand and on schedule.

• Monitored Workflows and Sessions

• Developed Unit test cases for the jobs.

• Involved in developing ad hoc reports using Business Objects.

• Identified the facts and dimensions and designed the relevant dimension and fact tables.

Environment: Informatica 5.1, Oracle 8i, Erwin 4.0, PL/SQL, UNIX and Windows NT.

Cadila, Bangalore, India Apr 2002 – Jan 2003

Oracle Developer

This project was developed to automate the Personnel Department of Ramky Constructions, Hyderabad. The module helps the HR and Personnel departments in screening candidate applications, conducting interviews and training programs, generating pay slips, tracking leave details, and handling yearly performance appraisals.

Responsibilities

• Developed PL/SQL stored procedures, packages, and triggers for data load and transformation, and data extraction.

• Created, Tested and debugged the Stored Procedures, Functions, Packages, Cursors and triggers using PL/SQL developer.

• Loaded data from files received in different formats into the development and production systems using SQL*Loader.

• Performed Shell Scripting using VI editor.

• Created external tables to access interface data directly from data files instead of loading the data into the database, per storage considerations.

• Involved in writing complex Oracle stored procedures providing functionality invoked from a Java application. Wrote complex queries using SQL.

• Maintained all documentation and spreadsheets for Oracle database operations. Prepared documentation for database operations standards.

• Handled large volumes of data, with some tables holding around 100 million records and others averaging 50 million records.

• Troubleshot development problems.

• Made subsequent design alterations based on user requirement changes.

• Implemented coding and documentation standards to be followed.

Environment: Oracle 8i, TOAD, SQL, PL/SQL, MS Visio, Java, Windows 2000.


