Navneeth Reddy
ETL Developer
Phone: 224-***-****
Email: ********.**@*****.***
SUMMARY
* ***** ** ********** ** information technology and data warehouse development, design, mapping, extraction, migration, data conversion, data validation, administration, and development of ETL processes.
Extensively worked on Extraction, Transformation, and Loading (ETL) processes using Informatica 9.x/8.x/7.x/6.x, developing user-centric data models and queries.
Expertise in complete lifecycle implementation of Business Intelligence solutions with Star and Snowflake schemas.
Strong experience in Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping and Mapplet Designer along with Workflow Manager and Monitor tools.
Excellent knowledge of the data warehouse development life cycle, including the Software Development Life Cycle (SDLC), relational and dimensional modeling, repository management and administration, and implementation of slowly changing dimensions.
Strong understanding of the principles of DW using fact tables, dimension tables and star schema modeling.
Expert in writing critical ETL stored procedures that extract data from different data feeds, transform it, and load it into databases used by business users in the form of reports.
Expertise in working with relational databases such as Oracle and SQL Server.
Expert in designing and developing backend PL/SQL packages, functions, and triggers in the database layer.
Extensive experience in Requirement Gathering, Customization, Maintenance and Implementation of various Business Applications.
Hands-on experience in resolving errors using the Debugger and session and workflow log files.
Extensive experience in writing SQL and PL/SQL scripts as per the requirement.
Efficient in troubleshooting, performance tuning, and optimization of ETL mappings.
Expertise in Teradata environments capable of storing terabytes of data. Worked with the SQL Assistant, Teradata Manager, and Teradata Administrator tools, and extensively used the Teradata MultiLoad (MLoad) utility with Informatica to load large data volumes while minimizing load time and system resource consumption.
Extensively worked with Teradata DBQL to analyze Teradata system performance.
Extensive expertise in developing ELT processes using Teradata MultiLoad and FastLoad.
Expert in developing complex test cases and performing unit and integration testing.
Experienced in gathering user specifications, designing and developing applications, and ensuring their smooth integration.
Performed activities including execution of test plans and design of exception-handling strategies. Produced documentation including detailed work plans and mapping documents.
Analytical and technical aptitude with the ability to solve complex problems; able to work well both in a team and independently under minimal supervision.
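For illustration, the slowly changing dimension handling mentioned above is typically Type 2 (history-preserving). A minimal sketch in Python, using in-memory rows as dicts (all column names here are hypothetical, not taken from any actual project):

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Type 2 SCD: expire the current row when a tracked attribute
    changes, and append a new current row for the same business key."""
    key = incoming["customer_id"]
    current = next(
        (r for r in dimension
         if r["customer_id"] == key and r["is_current"]),
        None,
    )
    if current is None:
        # New business key: insert as the first current row.
        dimension.append({**incoming, "valid_from": load_date,
                          "valid_to": None, "is_current": True})
    elif current["city"] != incoming["city"]:  # tracked attribute changed
        current["valid_to"] = load_date
        current["is_current"] = False
        dimension.append({**incoming, "valid_from": load_date,
                          "valid_to": None, "is_current": True})
    return dimension

dim = []
apply_scd2(dim, {"customer_id": 1, "city": "Austin"}, date(2020, 1, 1))
apply_scd2(dim, {"customer_id": 1, "city": "Dallas"}, date(2020, 6, 1))
# dim now holds one expired row (Austin) and one current row (Dallas)
```

In a real warehouse the same expire-and-insert logic is expressed as an Update Strategy transformation or a MERGE statement rather than Python.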
Education Details
Bachelor's in ECE, Bundelkhand University
Master's in ECE, New York Institute of Technology, NY
Technical Skills
ETL Tools: Ab Initio (GDE 3.1.7.5, 3.1.4, 2.15, 1.15; Co>Operating System 3.1.7.3, 3.1.2.2, 2.15), EME, Data Profiler, Trillium, SSIS, Informatica PowerCenter 8.5/8.1/7.1, AWS Glue
Hadoop Ecosystem: Apache Hive, Spark, Kafka, Scala, Kerberos, Sqoop, YARN
DBMS: Teradata, Oracle, Hive, Cassandra, DB2, MS SQL Server
Reporting Tools: Tableau
Scheduling Tools: Control-M, Autosys, Tivoli Maestro, Redwood
Other Tools: IntelliJ IDEA, Eclipse, SVN, Postman, Bitbucket, GitHub, Jenkins, Toad, PuTTY, Rational ClearCase, Swagger, Express>IT, Confluence, EME
Programming Languages: UNIX shell scripting, Scala, HiveQL, Python, Perl
Data Modelling/Methodologies: Data warehousing, logical/physical/dimensional modeling, Star/Snowflake schemas, ETL, OLAP, complete software development cycle
Operating Systems and Platforms: Linux, UNIX, Windows, Windows NT, AWS, Azure
Development Frameworks and Standards: Agile, Waterfall, PCI, SOX, PaaS, SaaS, IaaS
Client: Charter Communication, Charlotte, NC Jan 2017 – May 2020
ETL Developer
Responsibilities:
Responsible for Business Analysis and Requirements Collection.
Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Reviewed functional and business specifications and defined detailed ETL technical specifications.
Worked on creating conceptual, logical, and physical data models.
Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
Analyzed different source schemas and played a major role in creating and designing the common staging area.
Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
Created Test cases for the mappings developed and then created integration Testing Document.
Prepared the error handling document to maintain the error handling process.
Created scripts using Teradata utilities (FastLoad, MultiLoad, FastExport).
Modified BTEQ scripts to load data from the Teradata staging area to the Teradata data mart.
Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
Wrote complex PL/SQL statements to ensure data consistency by filtering corrupt data and implemented various constraints and triggers to preserve data integrity
Extensively used Stored Procedures, Functions and Packages using PL/SQL.
Created indexes on the tables for faster retrieval of the data to enhance database performance.
Involved in extensive DATA validation by writing several complex PL/SQL queries and Involved in back-end testing and worked with data quality issues
Developed complex SQL queries using various joins (Sub queries and Join conditions, correlated sub queries).
Automated the Informatica jobs using Control M, UNIX shell scripting.
Deployed jobs into production and used parameter files to externalize environment-specific properties, keeping jobs environment-independent.
Worked on mapping tool to document the requirements.
Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
Environment: Informatica 8.5, PL/SQL, Linux Script, Oracle, Toad, Teradata, SQL Developer.
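The landing-to-staging load with error handling described above can be sketched as a simple row-routing step. This is an illustrative Python simplification of what the mappings and error-handling process do (column names and validation rules are hypothetical):

```python
def stage_rows(landing_rows, required=("order_id", "amount")):
    """Route landing rows: valid rows go to staging, rows failing
    basic checks go to an error table with a reject reason."""
    staging, errors = [], []
    for row in landing_rows:
        missing = [c for c in required if row.get(c) in (None, "")]
        if missing:
            errors.append({**row, "reject_reason": f"missing {missing}"})
        elif not str(row["amount"]).replace(".", "", 1).isdigit():
            errors.append({**row, "reject_reason": "non-numeric amount"})
        else:
            # Row passed validation: cast and promote to staging.
            staging.append({**row, "amount": float(row["amount"])})
    return staging, errors

staging, errors = stage_rows([
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": "", "amount": "5"},
    {"order_id": "A2", "amount": "oops"},
])
# staging keeps the one clean row; the two bad rows land in errors
```

In the actual Informatica implementation this routing is done with Filter/Router transformations writing rejects to a dedicated error table.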
Client: ING US, West Chester, PA Sep 2016 – Jan 2017
ETL Developer
Responsibilities:
Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into databases.
Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
Designed, implemented and documented ETL processes and projects completely based on best data warehousing practices and standards.
Involved in performance tuning and fixed bottlenecks in processes already running in production, reducing load time by 3-4 hours for each process.
Designed and developed processes to handle high data volumes within a given load window or load intervals.
Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
Created complex mappings using Aggregator, Expression, Joiner transformations including complex lookups, Stored Procedures, Update Strategy and others.
Extensively used Power Exchange 8.1 for CDC.
Designed and developed table structures, stored procedures, and functions to implement business rules.
Extensively used XML transformation to generate target XML files.
Extracted data from various data sources such as Oracle, flat files, XML, and DB2.
Extensively developed shell scripts for Informatica Pre-Session, Post-Session Scripts.
Extensively involved in migration of Informatica Objects, Database objects from one environment to another.
Developed UNIX shell scripts and scheduled them using scheduling tools.
Tested data integrity among various sources, targets and various performance related issues.
Environment: Informatica PowerCenter 9/8.6 ETL, Power Exchange 8.1, Oracle 9i, PL/SQL, Toad, SQL, UNIX and Linux servers, shell scripting.
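The change data capture (CDC) done here with Power Exchange can be illustrated with a snapshot diff. Real CDC reads database change logs rather than full snapshots, so this Python sketch is a simplification for illustration only (key and column names are hypothetical):

```python
def diff_snapshots(source, target, key="id"):
    """Classify source rows against the target as inserts, updates,
    or deletes, keyed on a business key."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    inserts = [r for k, r in src.items() if k not in tgt]
    updates = [r for k, r in src.items() if k in tgt and r != tgt[k]]
    deletes = [r for k, r in tgt.items() if k not in src]
    return inserts, updates, deletes

ins, upd, dele = diff_snapshots(
    source=[{"id": 1, "val": "a"}, {"id": 2, "val": "B"}],
    target=[{"id": 2, "val": "b"}, {"id": 3, "val": "c"}],
)
# id 1 is new, id 2 changed, id 3 disappeared from the source
```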
Client: Bank of America, Charlotte, NC Sep 2014 – Sep 2016
ETL Developer
Responsibilities:
Evaluated and documented business processes to accurately reflect their current and future state.
Analyzed new applications as they relate to the business and to existing applications.
Prioritized business requirements and translated them into functional requirements.
Developed prototypes of a new application to be used with customers to confirm requirements.
Identified and tracked the slowly changing dimensions from heterogeneous sources and determined the hierarchies in dimensions.
Created several mappings using the Transformations like XML Sources, Source qualifier, Aggregator, Expression, lookup, Router, Filter, Rank, XML XSD, XML Parser, Sequence Generator, Custom and Update Strategy.
Designed and developed several Mapplets and worklets for reusability.
Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
Implemented weekly error tracking and correction process using Informatica.
Implemented an audit process to ensure the data warehouse matches the source systems in all reporting perspectives.
Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
Extensively used Stored Procedures, Functions and Packages using PL/SQL.
Created maestro schedules/jobs for automation of ETL load process.
Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
Involved in Unit testing, User Acceptance testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
Involved in developing test data/cases to verify accuracy and completeness of ETL process.
Wrote UNIX shell scripts to move data from source systems into the data warehousing system.
Actively involved in production support and transferred knowledge to other team members.
Coordinated between different teams across the circle and the organization to resolve release-related issues.
Environment: Informatica PowerCenter 8.1, PowerExchange, Power Analyzer, Erwin, Oracle 9i, SQL, PL/SQL, SQL Server 2008, flat files, XML files, Windows XP, MS Access, UNIX shell scripting, SQL*Loader, TOAD 8.6.1.0
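The source-to-warehouse audit process above typically compares row counts and a control total per load. A minimal Python sketch of that reconciliation check (the measure column and audit fields are hypothetical):

```python
def reconcile(source_rows, warehouse_rows, measure="amount"):
    """Compare row counts and a control total between source and
    warehouse; return a small audit record for the load."""
    src_count, wh_count = len(source_rows), len(warehouse_rows)
    src_sum = sum(r[measure] for r in source_rows)
    wh_sum = sum(r[measure] for r in warehouse_rows)
    return {
        "count_match": src_count == wh_count,
        "sum_match": abs(src_sum - wh_sum) < 1e-9,  # float tolerance
        "source_count": src_count,
        "warehouse_count": wh_count,
    }

audit = reconcile(
    [{"amount": 10.0}, {"amount": 5.5}],
    [{"amount": 10.0}, {"amount": 5.5}],
)
# a clean load: both counts and control totals agree
```

In practice the same counts and totals come from SQL aggregate queries against both systems, with mismatches raising an alert before reports are released.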
Client: Teradata India Pvt Ltd, Hyderabad, India Aug 2012 – Jun 2014
Software Engineer, Research and Development Wing
Responsibilities:
Successfully executed various projects, including:
PRPD (Partial Redistribution Partial Duplication): developed test cases for functional and performance testing and coded modules across different levels of the Teradata Database (2012).
Teradata Columnar (Column Partitioning): developed test cases for functional and performance testing and coded modules across different levels of the Teradata Database (2011).
As a member of dbsfrontline, supported various teams in identifying and resolving database issues.
Provided the support for Customer Support Team and handled the escalated issues.
Received the Night on the Town (NOTT) award twice for outstanding contributions to the Teradata Columnar and PRPD projects in 2012.
Enhanced the test validation procedure by developing stored procedures using XML plans.
Developed Perl scripts to automate the quantum plan comparison testing procedure.
Developed a plan comparison tool in C to compare query plans and print them in tree format.
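The plan comparison tool was written in C; for brevity, its two core ideas (indented tree rendering and a recursive operator-level diff) can be sketched in Python, with plans modeled as nested dicts (the operator names and plan shape here are invented for illustration):

```python
def print_plan(node, depth=0, out=None):
    """Render a query plan (nested dicts) as indented tree lines."""
    out = [] if out is None else out
    out.append("  " * depth + node["op"])
    for child in node.get("children", []):
        print_plan(child, depth + 1, out)
    return out

def diff_plans(a, b, path="root"):
    """Collect operator-level differences between two plans,
    reporting the tree path at which each difference occurs."""
    diffs = []
    if a["op"] != b["op"]:
        diffs.append(f"{path}: {a['op']} != {b['op']}")
    ca, cb = a.get("children", []), b.get("children", [])
    if len(ca) != len(cb):
        diffs.append(f"{path}: child count {len(ca)} != {len(cb)}")
    for i, (x, y) in enumerate(zip(ca, cb)):
        diffs.extend(diff_plans(x, y, f"{path}.{i}"))
    return diffs

plan_a = {"op": "JOIN", "children": [{"op": "SCAN t1"}, {"op": "SCAN t2"}]}
plan_b = {"op": "JOIN", "children": [{"op": "SCAN t1"}, {"op": "INDEX SCAN t2"}]}
tree = print_plan(plan_a)
diffs = diff_plans(plan_a, plan_b)
# tree is the indented rendering of plan_a; diffs pinpoints the
# one operator that changed between the two plans
```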