Raghu
C: 646-***-**** *****.***@*****.***
PROFESSIONAL SUMMARY:
Progressive and innovative software professional with over 8 years of experience in requirements understanding, system analysis, application design, development, testing, configuration management, client/user support services, and management of applications.
8 years of Data Analysis experience in User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
Experience in designing and developing Extract, Transform, and Load (ETL) processes on Teradata.
Strong Teradata experience, including SQL Assistant and BTEQ. Solid skills and expert knowledge of Teradata V15/V14/V13/V12/V6 and its utilities, such as BTEQ, MLOAD, FASTLOAD, and FASTEXPORT.
Good knowledge of and experience with Informatica as an ETL tool.
Proven track record in planning, building, and managing successful large-scale Data Warehouse and decision support systems. Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
Experience working on the performance tuning and optimization of Teradata SQL queries.
Good knowledge about the Teradata Architecture and Teradata Concepts.
Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter and Sequence Generator.
Very good understanding of Teradata MPP architecture: Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes, etc. Extensively used Teradata features such as FastLoad, MultiLoad, BTEQ scripting, FastExport, and SQL Assistant.
Varied experience in writing Stored Procedures, Triggers, and Functions in PL/SQL and T-SQL for performing and supporting data warehouse processes and tasks.
Proficient in Data Warehousing concepts and design techniques.
Extensively used Exception Handling Mappings for Data Quality, Data Cleansing and Data Validation.
Skilled in functional and technical requirement gathering and documentation.
Handled all domain and technical interaction with the client and the application users.
Experience in SQL Programming (Stored Procedures, Functions and Triggers) for faster transformations and Development using Teradata and Oracle.
Team player with ability to adapt to various environments with excellent communication, interpersonal skills and a strong commitment towards customer satisfaction.
TECHNICAL SKILLS:
Data Warehousing ETL
Teradata interface tools, Teradata SQL, Teradata utilities, Informatica PowerCenter 9.x/8.x/7.x, ETL, Data Mart, OLAP, and OLTP.
Databases
Teradata V5/V6/V12/V13/V14/V15, Oracle 11g/10g/9i, MS Access, SQL Server 7.0/2000/2005, MySQL, Netezza
Data Modeling
Physical Modeling, Logical Modeling, Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Entities, Attributes, Cardinality
Programming
SQL, PL/SQL, Unix, Shell Scripting, SQL Tuning/Optimization, Python
Tools
TOAD, SQL*Plus, Excel, Word, Control-M, SQL Assistant, SAS
Environment
UNIX, Windows XP/Vista, Linux.
EDUCATION: Bachelor of Technology (ECE), 2008, Jawaharlal Nehru Technological University, Hyderabad.
PROFESSIONAL EXPERIENCE:
Verizon, Temple Terrace, FL Feb 2015 - Present
Sr. Data Consultant / ETL Developer
Responsibilities:
Developed several overlays and unique tools for the Data Warehouse department using Informatica, Teradata, and Oracle. These tools collect data from files, convert it into the existing database, and make it available for reporting.
Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
Used Informatica as an ETL tool to pull data from source systems, then cleanse, transform, and load it into databases.
Extracted data from different sources like Teradata, Oracle, flat files and loaded into DWH.
Developed scripts to load, update and send data from different source systems to target systems using Multiload, BTEQ & FastLoad utilities of Teradata.
Tracked code migration to the UAT server and performed the initial checkouts.
Deployed and managed Teradata code across various environments.
Created SQL queries and built several reports from the above data mart for UAT and user reporting.
Performance-tuned Teradata SQL statements using the Teradata EXPLAIN command.
Organized the data efficiently in the Teradata system using Teradata Manufacturing Logical Data Model.
Worked on enhancements to BTEQ scripts that validated the performance tables in the Teradata environment.
Designed and developed complex mappings to extract data from diverse sources, including flat files, RDBMS tables, and legacy system files.
Provided production support for 6 months to stabilize the system and ensure efficient uploading/downloading of data across various systems.
Deployed Teradata and PL/SQL code and debugged issues; involved in comprehensive end-to-end testing.
Supported and maintained existing Teradata projects.
Developed UNIX Korn shell wrapper scripts to accept parameters and scheduled the processes using Control-M.
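A minimal sketch of such a parameter-driven wrapper (all names here, such as load_sales.btq and the script path, are illustrative only, not taken from the project):

```shell
#!/bin/ksh
# Illustrative wrapper sketch: accepts an environment and a load date,
# then drives a BTEQ script. Script and path names are hypothetical.
ENV=${1:-DEV}                       # target environment: DEV/UAT/PROD
LOAD_DT=${2:-$(date +%Y-%m-%d)}     # business date supplied by the scheduler
BTEQ_SCRIPT="scripts/${ENV}/load_sales.btq"

# Invoke BTEQ only when the client and the script are actually present;
# a scheduler such as Control-M would normally pass real parameters.
if command -v bteq >/dev/null 2>&1 && [ -f "$BTEQ_SCRIPT" ]; then
    bteq < "$BTEQ_SCRIPT" > "load_sales_${LOAD_DT}.log" 2>&1
    RC=$?
else
    echo "bteq unavailable; would run: bteq < $BTEQ_SCRIPT"
    RC=0
fi
```

The scheduler then keys off the wrapper's return code to decide whether downstream jobs may run.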
Environment: Teradata 15/14 (BTEQ, SQL Assistant, MultiLoad, FastLoad, FastExport), Informatica PowerCenter 9.1/8.6.1, SQL, SAS, Tableau, PL/SQL, UNIX shell scripts, Python, MS Excel, Oracle 11g.
Novant Health, Winston-Salem, NC Nov 2014 – Feb 2015
ETL Developer
Responsibilities:
Analysis, Design, Development, Testing and Deployment of Informatica workflows, BTEQ scripts, and shell scripts.
Performed source system analysis, provided input to data modeling, and developed ETL design documents per business requirements.
Developed BTEQ scripts to transform data from the staging layer to 3NF and then to aggregate tables.
Tuned complex Teradata queries to meet performance-level agreements using statistics, indexes, and partitioning techniques.
Developed Informatica mappings covering all sources, targets, and transformations using Informatica Designer.
Developed scripts to load data from source to staging and from staging to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.
Extracted data from various sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area.
Developed Informatica mappings and workflows to extract data and load it into the Teradata staging area using the FastLoad/TPump utilities.
Developed Informatica mappings for source-to-target loading using TPT connectors (load, update, export, and stream).
Used partitions to extract data from the source and load it into Teradata via TPT load, with proper load balancing on the Teradata server.
Generated flat files from Teradata 3NF tables using the Teradata FastExport utility, then FTPed them via shell script to a different UNIX server for the application team's consumption.
Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
Worked closely with the QA, Release Engineering, and Operations teams, providing timely responses to bugs and issues.
Documented production support docs, test cases, and transition documents.
Experienced with Agile/Scrum development methodologies.
Environment: Teradata 13/14, Oracle 11g/10g, Informatica PowerCenter 8.6/9.0, SAS, Control-M, ER Viewer, Windows XP, UNIX, Linux.
Verizon, Temple Terrace, FL May 2012 – Oct 2014
Teradata/ETL Developer
Responsibilities:
Responsible for gathering requirements from business analysts, formatting them according to business needs, and identifying the tables required for the data model.
Coded using Teradata analytical functions and BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
Involved in unit and integration testing; prepared test cases and performed unit testing.
Loaded data using MLoad, TPT, and FastLoad.
Extensively used MLOAD (MultiLoad) and FLOAD (FastLoad) in UNIX and Windows environments.
Optimized high volume tables in Teradata using various join index techniques, secondary indexes, join strategies and hash distribution methods.
Good knowledge of Teradata architecture, usage of BTEQ in the UNIX environment, and usage of MLoad, FLoad, TPump, and FastExport.
Loaded data into Teradata from flat files using a DataStage dynamic sequence job with the schema file concept.
Worked with Teradata DBA team in tuning BTEQ scripts to load and transform data.
Used collect statistics, secondary indexes, join indexes, and materialization as required to meet performance-level agreements.
Extensively worked on Teradata performance optimization, bringing spool-out and never-ending queries down to seconds or minutes using various Teradata optimization strategies.
Heavily involved in data analysis, helping the BAs get the data they needed.
Worked on complex queries to map the data as per the requirements.
Populated and refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
Involved in Performance tuning for the long running queries.
Reduced Teradata space usage by optimizing tables: adding compression where appropriate and ensuring optimum column definitions.
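The compression referenced above is Teradata multi-value compression, declared per column; a hypothetical illustration (table, column, and value lists are not from the project):

```sql
-- Hypothetical example of Teradata multi-value compression (MVC):
-- frequent values are stored once in the table header instead of per row.
CREATE TABLE sales_fact
( sale_id   INTEGER NOT NULL
, region_cd CHAR(2)       COMPRESS ('NE','SE','MW','WE')  -- frequent codes
, status_cd CHAR(1)       COMPRESS ('A','C')              -- NULLs also compressed
, sale_amt  DECIMAL(12,2) COMPRESS (0.00)
)
PRIMARY INDEX (sale_id);
```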
Environment: Teradata, Oracle, Informatica Power Center, SAS, ER Viewer, Business Objects, Windows XP, UNIX, LINUX.
Excellus Blue Cross Blue Shield, Rochester, NY Nov 2010 – May 2012
Teradata/Informatica Developer
Responsibilities:
Analyzed the source data coming from Oracle and worked with business users and developers to develop the model.
Developed ETL (Extract, Transformation, Load) processes on Teradata using utilities such as BTEQ (Import/Export), MultiLoad, FastLoad, and FastExport, with Teradata SQL Assistant as a productivity tool.
Used Microsoft Excel to capture application reporting.
Used TOAD to test SQL and view tables.
Organized the data efficiently in the Teradata system using Teradata Manufacturing Logical Data Model.
Managed tickets and resolved them in a timely manner, adhering to the defined SLAs.
Investigated and analyzed issues reported by customers and helped identify fixes.
Performed year-end rollover activities.
Used the EXPLAIN facility in tuning Teradata SQL.
Handled both incremental and migration data loads into Teradata.
Reviewed code and SQL to ensure queries were optimized and existing ETL data flows were not affected.
Used TOAD and SQL Developer for Oracle.
Generated flat files from Teradata using FastExport and BTEQ for dissemination to downstream dependent systems.
Developed a number of complex mappings, mapplets, and reusable transformations using Informatica Designer to facilitate daily and monthly data loads.
Optimized Performance of existing Informatica workflows.
Environment: Teradata 13.10 (BTEQ, SQL Assistant, MultiLoad, FastLoad, FastExport), Informatica PowerCenter 8.6.1, SQL, PL/SQL, UNIX shell scripts, MS Excel, SAS, Oracle 11g.
Sunmerge Systems Inc, India Jun 2008 – Oct 2010
ETL/ Informatica Developer
Responsibilities:
Extensively used ETL to load data from multiple sources into the staging area (Oracle 9i) using Informatica PowerCenter. Worked with pre- and post-sessions and extracted data from the transaction system into the staging area. Knowledgeable in identifying fact and dimension tables.
Created reusable transformations and mapplets and used them in mappings.
Extensively used Informatica PowerCenter to extract data from various sources and load in to staging database.
Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
Developed scripts for loading data into the target tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
Performed application-level activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts.
Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
Tuned sources, targets, mappings and sessions to improve the performance of data load.
Created several BTEQ, FastLoad, and MultiLoad scripts to load backfill data into the Data Warehouse.
Heavily involved in writing complex SQL queries based on the given requirements, including complex Teradata joins, stored procedures, and macros.
Created several Informatica mappings to populate data into dimension and fact tables.
Involved in Performance tuning for sources, targets, mappings, and sessions.
Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database, then tuned it by identifying and eliminating bottlenecks for optimum performance.
Implemented Type 2 Slowly Changing Dimensions to maintain all historical information in dimension tables.
Environment: Informatica PowerCenter, Oracle, Teradata, SQL, PL/SQL, TOAD, SQL*Plus, Autosys, Erwin, Windows, UNIX.