SUDHEER POTU
PROFESSIONAL SUMMARY
• Nine years of experience in the IT industry, specializing in Data Warehousing and Decision Support Systems analysis, design, implementation, development, testing, and deployment.
• Extensive experience in loading and maintaining Data Warehouses and Data Marts using Microsoft SQL Server (SSIS & DTS) and IBM WebSphere (Ascential) DataStage (Parallel & Server jobs) ETL processes.
• Extensive experience in T-SQL/SQL Development, Stored Procedures, Triggers, Views, Indexes, data integrity, Dynamic SQL, Error Handling, Data Analysis and Normalization.
• Extensive hands-on experience in dimensional data modeling using ERwin 3.5.2: Ralph Kimball approach, star/snowflake schemas, data marts, OLAP, fact and dimension tables, and physical and logical data modeling.
• Hands-on experience with BI tools: SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services (SSAS).
• Hands-on experience in the banking industry.
• Expert in Creating, Configuring and Fine-tuning ETL workflows designed in DTS and MS SQL Server Integration Services (SSIS).
• Extensive experience on Data Integration of Different types of data sources.
• Expert in designing, maintaining, and tuning RDBMSs such as SQL Server 2008/2005/2000 and Oracle 7.x/8i/9i.
• Experience in installation and configuration of MS SQL Server 2000/2005/2008 and Oracle.
• Sound knowledge of Business Intelligence tools used to improve business strategy.
• Sound knowledge of Relational Database Management Systems.
• Experience in database backup and recovery processes.
• Experience with Data Transformation Services, Batch processes and Import/Export.
• Successfully migrated content from Microsoft SQL Server 2000 (DTS) to Microsoft SQL Server 2005 (SSIS).
• Successfully migrated content from Microsoft Access databases to Microsoft SQL Server 2005.
• Experience in data conversion between Unicode and non-Unicode data formats.
• Experience in building multidimensional cubes using SQL Server Analysis Services.
• Experience in defining MDX queries and KPIs.
• Experience in cube partitioning.
• Experience in .NET scripting.
• Expertise in working with operational sources such as SQL Server 2008/2005/2000, Sybase, DB2, Oracle, flat files, and XML files, loading them into a staging area.
• Hands-on experience in installing, configuring, managing, monitoring, and troubleshooting MS SQL Server 2008/2005/2000.
• Experience in creating database tables and views, modifying tables per requirements, and managing indexes and referential integrity.
• Experience in transforming complex business logic into database designs and maintaining them using SQL tools such as SSIS, DTS, stored procedures, functions, and views.
• Sound knowledge of the Software Development Life Cycle (SDLC).
• Experience with BSA/AML compliance monitoring.
• Good business and systems analysis skills to understand requirements and customize client applications.
• Self-starter with strong analytical and troubleshooting skills; able to work independently or as part of a team.
• Highly responsible and adaptable.
• Results-oriented IT professional with a consistent record of exceeding expectations.
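As an illustration of the T-SQL development style summarized above (stored procedures, dynamic SQL, error handling), the sketch below uses hypothetical table, column, and parameter names; it is a generic pattern, not code from any of the engagements listed.

```sql
-- Hypothetical example: parameterized search with dynamic SQL and error handling.
-- dbo.Customer and its columns are illustrative names only.
CREATE PROCEDURE dbo.usp_SearchCustomers
    @City    NVARCHAR(50) = NULL,
    @SortCol SYSNAME      = N'CustomerName'
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- Whitelist the sort column to keep the dynamic SQL safe from injection.
        IF @SortCol NOT IN (N'CustomerName', N'City', N'CreatedDate')
            RAISERROR(N'Invalid sort column.', 16, 1);

        DECLARE @sql NVARCHAR(MAX) =
            N'SELECT CustomerID, CustomerName, City
              FROM dbo.Customer
              WHERE (@City IS NULL OR City = @City)
              ORDER BY ' + QUOTENAME(@SortCol) + N';';

        -- sp_executesql keeps @City as a true parameter rather than concatenated text.
        EXEC sys.sp_executesql @sql, N'@City NVARCHAR(50)', @City = @City;
    END TRY
    BEGIN CATCH
        -- Re-raise with context so a caller (e.g., an SSIS Execute SQL Task) sees the failure.
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);
    END CATCH
END;
```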
TECHNICAL SKILLS
Databases & Related Technologies : SQL Server 2008/2005/2000, Oracle 9i/8i/7.x, MS Access, DB2 UDB, Sybase
Operating Systems : Windows 7/NT/XP/2003/2000/98 and UNIX
ETL/BI Tools : SSIS (SQL Server Integration Services), DTS, DataStage 7.5.1/7.5.1A
Modeling Tools : ERwin 3.5.2
Analysis Tools : SQL Server Analysis Services (SSAS), OLAP
Reporting Tools : SQL Server Reporting Services (SSRS), Report Viewer, Excel Reports.
Languages : T-SQL, PL/SQL, Visual Basic 6.0
Office Tools : Microsoft Office 2010 / 2007 / 2003 / 97
Job Scheduling : Control-M, MS SQL Server Agent
EDUCATION
• Master of Information Technology
PROFESSIONAL EXPERIENCE
ProjectMix Security Lending Nov 2009 – Present
Credit Suisse, New York, NY
SQL / BI Consultant
ProjectMix is a SQL Server data sourcing, analytics, and reporting application that sources data once a day from various Sybase sources (GSL, PSDW, ACE & MIS). These source systems are trading platforms for Global Prime Lending at Credit Suisse. Data sourcing is scheduled through the Control-M application. ProjectMix performs cleansing and data transformation, applies business rules to enrich the data, and creates a central data repository in SQL Server. This repository is used by SSRS to generate reports and also serves as an intermediate data source for building the data mart. Once data acquisition and calculation are complete, an email notification is sent to users with an SSRS report link defaulted to the last business date. All SSRS reports are hosted on IIS and secured with per-user permissions.
Responsibilities
• Designed and implemented complex SSIS packages to migrate data from multiple data sources for data analysis, deployment and dynamic configuration of SSIS packages.
• Created, configured, and fine-tuned ETL workflows in MS SQL Server Integration Services (SSIS) 2005.
• Extracted and reviewed data from heterogeneous sources from OLTP to OLAP using SQL Server Integration Services (SSIS) 2005.
• Designed, reviewed, and created primary database objects (tables, views, indexes, etc.) based on logical design models, user requirements and physical constraints.
• Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
• Handled SSIS package deployment to various environments (QA, UAT and Production) using Deployment Manifest.
• Created and scheduled Control-M jobs (SSIS packages) using the Control-M application and SQL Server Agent.
• Defined Control-M jobs and webdocs, and maintained Control-M job dependencies using In/Out conditions.
• Monitored production batch execution of SSIS packages through Control-M and fixed any failures that occurred during the production batch process.
• Actively involved in upgrading the SQL database server and SSIS application server from 32-bit to 64-bit, and migrated all SSIS packages, configuration files, and related source-file folder structures across environments.
• Handled Version Control through Tortoise SVN by regular check-in and check-outs.
• Extensively created and modified RFCs using Remedy Change Management (ITSM 6).
• Created RFCs for new databases, took the initiative to obtain approvals, and worked with DBAs to create the databases.
• Handled SQL Server database backup and Restore process.
• Handled daily production database script backups and stored them in SVN.
• Used WS_FTP Pro 6.0 to connect to various FTP servers and download files.
• Used the Beyond Compare tool to compare files and stored procedures.
• Extensively used Embarcadero RapidSQL to create and update database tables, views, stored procedures and complex queries on Sybase database systems.
• Installed and configured IBM DB2 ODBC drivers to connect and get source data from DB2 servers.
• Created and maintained SQL Server database objects like tables, views, stored procedures and Indexes.
• Created and maintained SQL Server database users and user groups; and assigned Roles and Privileges to user groups.
• Optimized the performance of SSIS packages and stored procedures by creating indexes.
• Generated sub-reports, drill-down reports, drill-through reports, and parameterized reports using SSRS and Report Builder.
• Created reports to retrieve data using Stored Procedures that accept parameters.
• Created jobs to generate daily, weekly and monthly reports for the management.
• Generated daily Excel profitability reports, stored them on the SharePoint server, and emailed the shared link to all business users.
• Provided step-by-step documentation of the created SSIS packages along with the business logic.
• Documented the Control-M batch process, job dependencies, source system information, upstream system contacts, etc.
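A minimal sketch of the pattern behind the parameterized SSRS reports described above: a stored procedure that defaults its date parameter to the last business date when the report link supplies none. Table and column names are hypothetical, not the actual ProjectMix schema.

```sql
-- Hypothetical report procedure; dbo.PositionFact and dbo.BusinessCalendar are illustrative.
CREATE PROCEDURE dbo.usp_rpt_DailyPositions
    @BusinessDate DATE = NULL   -- SSRS passes NULL when the report link uses the default
AS
BEGIN
    SET NOCOUNT ON;

    -- Default to the most recent completed business date on the calendar.
    IF @BusinessDate IS NULL
        SELECT @BusinessDate = MAX(CalendarDate)
        FROM dbo.BusinessCalendar
        WHERE IsBusinessDay = 1
          AND CalendarDate < CAST(GETDATE() AS DATE);

    SELECT f.AccountID, f.SecurityID, f.Quantity, f.MarketValue
    FROM dbo.PositionFact AS f
    WHERE f.BusinessDate = @BusinessDate;
END;
```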
Environment
SQL Server 2008, SQL Server 2005, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), ERwin, VBScript, .NET 3.5, Sybase, Embarcadero RapidSQL 7.5.5, MS Access, WS_FTP Pro 6.0, XML, UNIX, Microsoft Visio, MS Project, Quality Center, and Remedy ITSM.
AREIS (Automated Real Estate Investment System) Aug 2008 – Oct 2009
CalPERS (California Public Employees Retirement System), Sacramento, CA
SQL / Interface Developer
CalPERS provides retirement, health, and related financial programs and benefits to more than 1.6 million public employees, retirees, and their families, and to more than 2,500 public employers.
The AREIS project defines the data exchange (hereafter referred to as Interfaces) strategy for the Automated Real Estate Investment System (AREIS). It also describes the various interfaces, implementation phases, and roles and responsibilities. The AREIS implementation requires the exchange of data (interfaces) with a number of internal and external systems. Interface implementation is carried out in five phases: define, design, build, test, and deploy.
The interfaces are:
MRI to IMPACT
PeopleSoft G/L
Foreign Currency Exchange Rate Information
Bank of America Reconciliation
State Street Bank Reconciliation
State Street Bank Cash Managers
Argus
Responsibilities
• Created and edited the Design Specification documents for the Interface SSIS Packages.
• Designed source-to-target mappings from the source systems to the operational staging database using a star schema.
• Created the interface packages based on the mapping document using SQL Server Integration Services (SSIS).
• Extensively handled data integration and data cleansing.
• Extensively designed packages and data mappings using the Control Flow, Sequence Container, Foreach Loop Container, Execute SQL Task, Data Flow Task, Data Conversion, Derived Column, and Script components in SSIS Designer.
• Handled performance tuning of the developed packages to ensure faster data loads.
• Handled Unicode and non-Unicode data formats using the Data Conversion task.
• Participated in creating the required database tables and indexes and loading the data.
• Standardized the component tasks and names as per given naming standard document.
• Handled Version Control through CalPERS Dimensions software.
• Participated in creating the required stored procedures.
• Created data sources and data source views for SSAS cubes.
• Participated in building multidimensional cubes using SSAS.
• Participated in partitioning the multidimensional cubes.
• Participated in writing MDX queries and KPIs.
• Optimized SQL queries in SQL Server.
• Configured the source database, target database, and flat-file connections of the interface SSIS packages through an XML configuration file so the scheduled packages could run under SQL Server Agent.
• Created configuration files for the developed interface SSIS packages and deployed the packages to the testing and production environments.
• Parameterized all configurable values such as database connection strings, source file paths, mail servers, and email addresses.
• Generated reports aligned with user requirements using SSRS.
• Created tabular and matrix reports using Report Builder.
• Integrated Quality check process with AREIS Interfaces.
• Performed database performance optimization and normalization.
• Tuned the database and optimized queries and indexes.
• Handled configuration management using HP Quality Center.
• Handled package scheduling, batching, alerts, and email notification settings.
• Provided the client with documentation containing component- and task-level details.
• Performed unit testing of the developed packages to ensure they met the requirements.
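The XML package configurations described above typically look like the fragment below: a .dtsConfig file that overrides a connection string at run time so the same package can run unchanged in test and production. The server, database, and connection manager names here are placeholders, not the actual AREIS values.

```xml
<!-- Hypothetical Interface.dtsConfig: overrides the target connection at run time -->
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[TargetDB].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=PRODSQL01;Initial Catalog=StagingDB;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

In SSIS 2005, a package pointed at such a file (directly or via an environment variable) picks up the new connection string before any tasks execute, which is what makes deployment to QA, UAT, and production a matter of swapping config files rather than editing packages.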
Environment
SQL Server 2005, SQL Server 2000, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), ERwin, VBScript, VB.NET, MS Access, and XML.
Loan Data Mart - Phase 2 Oct 2007 – Jun 2008
CoreLogic, Sacramento, CA
SQL / SSIS Developer
CoreLogic-developed analytics provide mortgage-lending institutions, secondary-market investors, and mortgage insurers with state-of-the-art products that confirm property ownership, evaluate collateral, and prospect for new business.
Loan Data Mart Phase 2 involved moving the solution developed in Phase 1 to production and scheduling the historical loads. A few small applications, such as GIS, were also developed in this phase.
Responsibilities
• Created and edited the Design Specification documents for the SSIS Packages
• Coordinated with team members on changes to business requirements and the Data Mart schema.
• Involved in dimensional data modeling using ERwin.
• Designed and developed source-to-target mappings from the source systems to the operational staging database, and from staging to the target enterprise data warehouse, using a star schema.
• Created the ETL packages based on the mapping document using SQL Server Integration Services (SSIS).
• Extensively handled data integration and data cleansing.
• Extensively designed packages and data mappings using the Control Flow, Sequence Container, Data Flow Task, Execute SQL Task, Data Conversion, Derived Column, and Script components in SSIS Designer.
• Handled performance tuning of the developed packages to ensure faster data loads.
• Handled Unicode and non-Unicode data formats using the Data Conversion task.
• Participated in creating the required database tables and indexes and loading the data.
• Standardized the component tasks and names per the given naming standards document.
• Handled Version Control through MS Visual Source Safe
• Participated in writing the required stored procedures.
• Involved in building data sources, data source views, and multidimensional cubes using SSAS.
• Optimized SQL queries in SQL Server.
• Parameterized all configurable values such as database connection strings, source file paths, mail servers, and email addresses.
• Generated reports aligned with user requirements using SSRS.
• Integrated the quality-check process with the Loan Data Mart packages.
• Involved in database backup and restore scripts.
• Performed database performance optimization and normalization.
• Tuned the database and optimized queries and indexes.
• Handled package scheduling, batching, alerts, and email notification settings.
• Provided the client with documentation containing component- and task-level details.
• Developed packages to load Aggregates related to GIS (Geographical Information Systems).
• Developed Views on the Aggregate tables for GIS application.
• Performed unit testing of the developed packages to ensure they met the requirements.
• Received appreciation emails from the client during the project.
Environment
SQL Server 2005, SQL Server 2000, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), ERwin, VBScript, VB.NET, MS Access, and XML.
Loan Data Mart – Phase 1 Jan 2007 – Sep 2007
CoreLogic, Sacramento, CA
SQL / SSIS Developer
CoreLogic-developed analytics provide mortgage-lending institutions, secondary-market investors, and mortgage insurers with state-of-the-art products that confirm property ownership, evaluate collateral, and prospect for new business.
CoreLogic's products and services (primarily the third-party scorecard) depend on the loan database, which captures most (but not all) of the loan information the company receives from its customers. Currently, only loans provided by third-party scorecard customers are captured in the loans database.
The loans database, in its current state, is sufficient for its intended purpose, but the process that maintains the loan data may not be efficient, scalable, or high-performing.
The loan database population process involves standardizing the input loan files, assimilating all relevant loan information, loading it into the loan database, and aggregating key metrics across several dimensions for reporting.
This data mart acts as a single consolidated repository of all loans.
Responsibilities
• Designed and developed the ETL packages based on the mapping document using SQL Server Integration Services (SSIS).
• Participated in the review of Technical Specification and Business Requirements Specification
• Involved in data integration.
• Involved in dimensional data modeling using ERwin.
• Used transformation components such as Aggregate, Lookup, Fuzzy Lookup, Sort, Data Conversion, Derived Column, and Slowly Changing Dimension (SCD) to implement the transformation logic.
• Involved in data analysis of the final output.
• Analyzed the data model for source-to-target mapping.
• Created the required database tables and indexes.
• Standardized the component tasks and names based on the given mapping document.
• Developed stored procedures for complex packages and called them in SSIS using the Execute SQL Task.
• Optimized SQL queries in SQL Server.
• Imported data from MS Access databases and XML into SQL Server databases.
• Built multidimensional cubes using SSAS.
• Maintained clustered and non-clustered indexes.
• Designed reports with slice-and-dice and drill-down operations using SSRS.
• Created packages and monitored package history for maximum data availability and to ensure database consistency.
• Handled version control through MS Visual SourceSafe.
• Developed test cases for every component in the solution.
• Handled performance tuning of the developed packages.
• Documented all the packages developed.
• Tested the developed solution.
• Performed unit testing of the developed packages and created the unit test cases.
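The Slowly Changing Dimension work referenced above was built with the SSIS SCD component; expressed in plain T-SQL, the equivalent Type 2 pattern looks roughly like the sketch below. Table and column names are illustrative, and it is written for the SQL Server 2005 era (i.e., without MERGE).

```sql
-- Hypothetical Type 2 SCD load: dbo.DimLoan and stg.Loan are illustrative names.
-- Step 1: expire current dimension rows whose tracked attributes changed.
UPDATE d
SET d.IsCurrent = 0,
    d.EndDate   = GETDATE()
FROM dbo.DimLoan AS d
JOIN stg.Loan    AS s ON s.LoanNumber = d.LoanNumber
WHERE d.IsCurrent = 1
  AND (d.InterestRate <> s.InterestRate OR d.LoanStatus <> s.LoanStatus);

-- Step 2: insert a new current row for changed loans and for brand-new loans.
INSERT INTO dbo.DimLoan (LoanNumber, InterestRate, LoanStatus, StartDate, EndDate, IsCurrent)
SELECT s.LoanNumber, s.InterestRate, s.LoanStatus, GETDATE(), NULL, 1
FROM stg.Loan AS s
LEFT JOIN dbo.DimLoan AS d
       ON d.LoanNumber = s.LoanNumber AND d.IsCurrent = 1
WHERE d.LoanNumber IS NULL;   -- no current row: either new, or just expired in step 1
```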
Environment
SQL Server 2005, SQL Server 2000, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), ERwin, VBScript, VB.NET, MS Access, and XML.
USTA Jun 2006 – Dec 2006
Quaero Corporation, Charlotte, NC
SQL / SSIS Developer
Quaero Corporation specializes in identifying potential opportunities and provides marketing and campaign services on behalf of its customers.
The United States Tennis Association (USTA), a very important customer for Quaero, wanted to move from SQL Server 2000 (DTS) to SQL Server 2005 (SSIS) and migrate the entire extract-and-load process. The customer also wanted the packages to be portable, so they could move from the development to the testing to the production environment without too many changes.
Responsibilities
• Extensively designed data mappings using the Data Flow Task, Execute SQL Task, Script Task, and Sequence Container in SSIS Designer with SQL Server 2005.
• Created the required packages based on the mapping document using SSIS.
• Extracted data from different sources, cleansed and integrated it into a single format, and loaded it into the database.
• Implemented the transformation logic on the source data before loading it into the target database.
• Created the required database tables and loaded the data.
• Created indexes on the developed database tables.
• Standardized the component tasks and names.
• Involved in writing the required stored procedures.
• Parameterized all configurable values such as database connection strings, source file paths, mail servers, and email addresses.
• Produced high-level and low-level documentation for all the packages developed.
• Involved in initiating data quality requirements at all stages of the project life cycle (PLC).
• Performed appropriate tuning to improve performance and provide maximum efficiency.
• Served as quality facilitator for the project, reviewing whether CMM standards and policies were being adhered to.
• Responsible for writing and executing unit test cases for the SSIS packages.
Environment
SQL Server 2005, SQL Server 2000, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Analysis Services (SSAS), Microsoft SQL Server Reporting Services (SSRS), ERwin, VBScript, MS Access, and XML.
IBM Global Services, India Sep 2005 – May 2006
Fireman’s Fund Insurance Company, Novato, CA
Data Management Transformation (DMT)
ETL Developer
As of today, there are over 400 different metrics in existence throughout the company. Requirements for metrics tend to be business specific versus enterprise-wide. Most of Fireman's Fund's decision-support systems fail to meet the needs of the management group and important decision makers for consistent and reliable data beyond a core group of financial metrics. The most useful information on the key value drivers is typically difficult, if not impossible, to access and is often out of date. Even though the Finance organization recognizes its responsibility to meet the organization's strategic information needs, it struggles to provide strategic information in an efficient and timely manner.
Responsibilities:
• Reviewed the ETL specifications.
• Designed and developed SSIS packages using the Control Flow, Data Flow Task, Execute SQL Task, Script Component, and Derived Column components in SSIS Designer.
• Extracted data from different sources, then cleansed, integrated, and transformed it before loading it into the target database.
• Created the required database tables and loaded the data.
• Standardized the component tasks and names.
• Involved in converting data from non-Unicode to Unicode.
• Wrote the required stored procedures.
• Performed appropriate tuning to improve performance and provide maximum efficiency.
• Provided the client with documentation containing component- and task-level details.
• Responsible for writing unit test cases and testing the packages.
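The non-Unicode-to-Unicode conversions mentioned above were done with the SSIS Data Conversion component; the same transformation expressed in T-SQL is simply a cast to a Unicode type, as in this illustrative snippet (table and column names are hypothetical).

```sql
-- Hypothetical example: load a varchar (non-Unicode) source column into an nvarchar target.
INSERT INTO dbo.TargetMetric (MetricName, MetricValue)
SELECT CAST(src.metric_name AS NVARCHAR(100)),  -- the T-SQL analogue of SSIS DT_STR -> DT_WSTR
       src.metric_value
FROM stg.SourceMetric AS src;
```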
Environment
SQL Server 2005, SQL Server 2000, Microsoft SQL Server Integration Services (SSIS), Microsoft SQL Server Reporting Services (SSRS), VBScript, MS Access, and XML.
RHI (Robert Half International) Jan 2004 – Aug 2005
ETL Developer
This project supported a marketing company's advertising business, helping the company attract more ads and improve its sales. The client delivered source data, consisting of many flat files, through an FTP server, from which we downloaded it. Per the technical specification, the flat-file data was loaded into staging database tables, with a separate database used for the target tables. Based on the mapping document, data was populated into the target tables using different stages.
Responsibilities:
• Extensively used DataStage Designer to develop parallel jobs that extract, cleanse, transform, integrate, and load data into enterprise data warehouse tables.
• Designed and developed ETL jobs using the DataStage parallel extender.
• Extensively used DataStage Manager to import/export metadata and DataStage components between projects.
• Created the required database tables and loaded the data.
• Standardized the component stages and names.
• Used stages such as Join, Aggregator, Link Partitioner/Link Collector, and Inter-Process.
• Extensively worked with DataStage shared containers to reuse business functionality.
• Developed and maintained job sequences.
• Involved in reviewing the specifications, jobs, and sequences.
• Involved in error handling.
• Provided the client with documentation containing component- and task-level details.
• Involved in creating unit test plans.
• Involved in executing unit test plans.
Environment
Ascential DataStage 7.5.1, Oracle 8i, DB2 UDB, UNIX, ERwin.
ONYX Software Corp, Bellevue, WA Apr 2003 – Dec 2003
Customer Data Warehouse
SQL Developer
The primary mission of the Customer Data Warehouse (CDW) project is to build a single conformed dimensional database that enables users to query, analyze, and use data from multiple disparate source systems in one place with various end-user tools.
The goals of the CDW data architecture are:
• Ability to guarantee the CDW data is as accurate as the source system data.
• Ability to support multiple concurrent queries with reasonable return times.
• Ability to have the CDW database on-line and available during normal business hours.
• Provide a comprehensive and scalable architecture that can reliably serve REI’s marketing requirements.
Responsibilities:
• Responsible for developing DataStage jobs based on the given mapping document.
• Created the required database tables and loaded the data.
• Extracted data from different sources and cleansed and transformed it before loading it into the target database.
• Standardized the component tasks and names.
• Wrote the required routines.
• Parameterized all configurable values such as database connection strings, source file paths, mail servers, and email addresses.
• Performed appropriate tuning to improve performance and provide maximum efficiency.
• Performed performance tuning on the developed solution.
• Used DataStage to extract data from the Oracle database and flat files and load it into the target DB2 database.
• Responsible for writing unit test cases and testing the jobs.
Environment
DataStage 6.0, Oracle 8i, DB2, Windows XP.