
Data SQL

Location:
Fremont, CA
Posted:
November 20, 2016

Contact this candidate

Resume:

Narendra Reddy Venna

408-***-****

acxkuu@r.postjobfree.com

PROFESSIONAL SUMMARY

Teradata 12 Certified Technical Specialist with over 5 years of experience supporting data warehouses residing on Teradata; used all the Teradata utilities (MultiLoad, FastLoad, FastExport, BTEQ, Queryman, SQL Assistant, Visual Explain) extensively.

Sound knowledge of Data Warehousing concepts, E-R (3NF) and dimensional modeling such as Star Schema and Snowflake Schema, database architecture for OLTP and OLAP applications, Data Analysis, and ETL processes.

Experience working with both 3NF and dimensional models for data warehouses, and a good understanding of OLAP/OLTP systems.

Experience in scheduling workflows using scheduling tools such as cron, Autosys, Control-M, etc.

Proficient in using Informatica Designer, Workflow Manager, Workflow Monitor, and the Administration Console to create, schedule, and control workflows, tasks, and sessions.

Excellent Data Analysis skills and ability to translate business logic into mappings using complex transformation logics for ETL processes.

Expert in troubleshooting/debugging and improving performance at different stages, such as the database, workflow, and mapping levels.

Worked on Big Data technologies like Hadoop, MapReduce, Hive, Pig, and Sqoop.

Extensive experience in administration and maintenance of Dev, Stage, Prod, and standby databases for DSS and Data Warehousing environments.

Experienced in UNIX Shell scripting as part of file manipulation and text processing.

Good experience with Teradata query performance tuning using collect statistics and secondary indexes.
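As a sketch, the tuning steps above typically look like the following Teradata SQL; the database, table, and column names here are illustrative, not from a real project:

```sql
-- Refresh optimizer statistics on frequently joined/filtered columns
-- (sales_db.daily_sales and its columns are hypothetical).
COLLECT STATISTICS ON sales_db.daily_sales COLUMN (store_id);
COLLECT STATISTICS ON sales_db.daily_sales COLUMN (sale_date);

-- Add a named secondary index (NUSI) to support equality lookups
-- on a non-Primary-Index column.
CREATE INDEX idx_sale_date (sale_date) ON sales_db.daily_sales;

-- Confirm via the plan that the statistics/index are being used.
EXPLAIN
SELECT *
FROM sales_db.daily_sales
WHERE sale_date = DATE '2016-11-01';
```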

Possess sound knowledge of Teradata Database Architecture, database features, and Teradata tools; worked on Teradata Database internals: the Parser, Optimizer, and AMP modules.

Programming experience in Teradata SQL.

Expertise in Application Development with knowledge of full software development life cycle.

Proficient in using Teradata Administrator for managing Databases, Users, Tables, Indexes, and statistics; managing permissions (Roles & Profiles); and addressing user issues such as resetting passwords, unlocking user IDs, etc.

Strong skills in resolving Teradata issues, providing workaround for the problems, knowledge on diagnostics, Database Tuning, SQL Tuning and Performance Monitoring.

Experience in Database Programming, SQL, PL/SQL (Stored Procedures, Functions and Triggers) for faster transformations and Development using Teradata.

Participated in the full Software Development Life Cycle (SDLC) of the data warehousing project: planning, business requirement analysis, data analysis, logical and physical database design, setting up the warehouse physical schema and architecture, developing reports, security, and deployment to end users.

Strong experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Teradata.

Used Derived Tables, Volatile Tables, and Global Temporary Tables (GTT) extensively in many BTEQ scripts.
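For illustration, a volatile table inside a BTEQ script might be used like this to stage intermediate results; the table and column names are made up for the sketch:

```sql
-- Stage today's rows in a session-local volatile table
-- (ods.orders and its columns are hypothetical).
CREATE VOLATILE TABLE stage_orders AS
  ( SELECT order_id, cust_id, order_amt
    FROM ods.orders
    WHERE order_date = CURRENT_DATE )
WITH DATA
PRIMARY INDEX (order_id)
ON COMMIT PRESERVE ROWS;

-- A complex query is then broken into simpler steps against the stage.
SELECT cust_id, SUM(order_amt) AS day_total
FROM stage_orders
GROUP BY cust_id;
```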

Hands-on experience in Teradata Performance tuning using Explain, Statistics collections, Skew factor analysis and Row Distribution.

Proficient in complex Data Warehousing concepts including Data Marts and Data Mining using Dimensional Modeling (Star Schema, Snowflake Schema) and normalized (3NF) structures.

Expertise in client-server application development using ORACLE 11g/10g, PL/SQL, SQL*PLUS, TOAD and SQL*LOADER.

Strong knowledge of Oracle utilities such as SQL*Loader and Export/Import, and the Data Pump utilities EXPDP/IMPDP.

Experience with VI editor and Korn Shell on different UNIX flavors, developed different Shell Scripts for Data warehousing and migration projects.

Enthusiastic and goal-oriented team player possessing excellent communication, interpersonal skills and leadership capabilities with high level of adaptability.

EDUCATION

Master's in Computer Science, NPU.

CERTIFICATIONS

Teradata Professional Certified Developer

Teradata Certified SQL Developer

Teradata Certified Physical Implementer

Holding all three of these certifications qualifies me as a Teradata Certified Technical Specialist.

Hortonworks Hadoop Developer certification exam scheduled for November 22, 2016; confident of clearing it.

TOOLS AND TECHNOLOGIES

Skill                             Proficiency   Yrs of Exp   Year Last Used
Teradata                          Excellent     5 yrs        2016
BTEQ                              Excellent     5 yrs        2016
FastLoad                          Excellent     5 yrs        2016
MultiLoad                         Excellent     5 yrs        2016
FastExport                        Excellent     5 yrs        2016
TPump                             Excellent     5 yrs        2016
Teradata LDM                      Excellent     5 yrs        2016
Teradata Visual Explain           Excellent     5 yrs        2016
Teradata Index Wizard             Excellent     5 yrs        2016
Teradata Stats Wizard             Excellent     5 yrs        2016
DBQL                              Excellent     5 yrs        2016
QCD                               Excellent     5 yrs        2016
Teradata SQL Performance Tuning   Excellent     5 yrs        2016
C++                               Moderate      3 yrs        2016
Java                              Moderate      3 yrs        2016
Informatica                       Moderate      1 yr         2016
Hadoop                            Entry Level   6 months     2016
Sqoop                             Entry Level   6 months     2016
Flume                             Entry Level   6 months     2016
Pig                               Entry Level   6 months     2016
Hive                              Entry Level   6 months     2016
EDW                               Excellent     5 yrs        2016

EXPERIENCE

Employer: Exilant (Jan 2016 to Present)

Client: Apple Inc.

Sunnyvale, CA

Role: Teradata/Informatica Developer (part of the Global Business Intelligence Team, Apple Inc.)

Project: Campaign Operations (Apple Music and COP Dashboard implementation)

Project: AST2 (Apple Service Toolkit)

Responsibilities

Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.

Extensively used the Teradata utilities like BTEQ, FastLoad, Multiload, DDL Commands and DML Commands (SQL).

Implemented the COP Dashboard in Java.

Developed TPT scripts to load data from Load Ready Files to Teradata Warehouse.

Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.

Used SQL to query the databases and do as much crunching as possible inside Teradata, applying complex SQL query optimization (explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.

Monitored database space, identified tables with high skew, and worked with the data modeling team to change the Primary Index on tables with high skew.
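A common way to spot high-skew tables is a query against the standard DBC views; this is a sketch, and the database name and alert threshold here are arbitrary placeholders:

```sql
-- Skew factor per table: 0 = perfectly even across AMPs, higher = more skewed.
-- Assumes non-empty tables (MAX(CurrentPerm) > 0).
SELECT DatabaseName,
       TableName,
       CAST(100 * (1 - AVG(CurrentPerm) / MAX(CurrentPerm)) AS DECIMAL(5,2)) AS SkewFactor
FROM DBC.TableSizeV
WHERE DatabaseName = 'sales_db'      -- hypothetical database
GROUP BY DatabaseName, TableName
HAVING SkewFactor > 50               -- arbitrary alert threshold
ORDER BY SkewFactor DESC;
```

Tables surfacing here are candidates for a Primary Index change toward a more evenly distributed column.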

Tuned Teradata SQL statements using Explain: analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.

Loaded flat files into the database using FastLoad, then used them in join queries.

Used Teradata SQL with BTEQ scripts to get the data needed.

Created proper PIs taking into consideration both planned access and even distribution of data across all the available AMPs.

Used Teradata utilities (FastLoad, MultiLoad) to load data into the Target Data Warehouse and used Teradata SQL Workbench to query data in the target Teradata data warehouse.

Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.

Created and automated the freight and shrink loading process using shell scripts, MultiLoad, Teradata volatile tables, and complex SQL statements.

Created a generic email notification program in UNIX that sends the emails to the production support team if there are any duplicate records or error in load process.
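A minimal sketch of such a notification script follows; the log format, match patterns, and recipient address are assumptions for illustration, not the original production values, and the actual mail command is left commented out:

```shell
#!/bin/sh
# Hedged sketch of a generic load-failure notification, assuming the load
# utility (e.g. MultiLoad) writes duplicate-row and failure messages to a log.

check_load_log() {
    log_file="$1"
    recipients="prod-support@example.com"   # placeholder mailing list
    # Count lines flagging duplicate rows or outright failures in the log.
    issues=$(grep -c -E 'Duplicate row|\*\*\* Failure' "$log_file")
    if [ "$issues" -gt 0 ]; then
        echo "Load log $log_file: $issues issue(s) found; would notify $recipients"
        # A real script would send the log, e.g.:
        # mailx -s "Load issues in $log_file" "$recipients" < "$log_file"
    fi
}

# Demonstrate on a sample log standing in for a real MultiLoad log.
printf '*** Failure 2802 Duplicate row error in tgt_table\n' > sample.log
check_load_log sample.log
```

Wiring this into the load wrapper after each MultiLoad step gives the production support team an alert without anyone tailing logs by hand.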

Extensively used SQL *Loader, UTL_FILE and External Table mechanisms to load legacy data, feed files data and data from various sources into the Oracle database tables.

Involved in writing PL/SQL Store Procedures, Functions, Cursors, Triggers and Packages.

Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.

Created several Teradata SQL queries and created several reports using the above data mart for UAT and user reports.

Used several SQL features such as GROUP BY, ROLLUP, RANK, CASE, UNION, subqueries, EXISTS, COALESCE, and NULL handling.

Used volatile tables and derived queries to break complex queries into simpler ones.

Worked on exporting data to flat files using Teradata Fast Export.

Extensively involved in data transformations, validations, extraction and loading process.

Implemented various Teradata join types (inner, outer, self, cross) and join strategies (Merge Join, Product Join, Nested Join, Row Hash Join).

Involved in writing Store Procedures, Functions, Cursors, Triggers and Packages.

Used various transformations (Source Qualifier, Aggregator, connected & unconnected Lookup) to handle situations depending on the requirement.

Working with QC and UAT teams for the respective team approvals.

Working with offshore team for development and administrative activities.

Working with Teradata DBAs on automated scripts for stats collections on tables, indexes and Columns.

Interaction with different teams for finding failure of jobs running in production systems and providing solution, starting the jobs and making sure jobs complete in the specified time window.

Developed the end-user documentation and performed periodic updates of the design documents. Involved in Application Maintenance and Production Support.

Work with source system & business sponsor teams to understand requirements and translate them to technical specifications

Perform data analysis and design pertinent to the requirements

Evaluate information gathered from multiple sources, reconcile conflicts, and distinguish user requests

Collaborate with Business, PM, and other technology team

Serve as the conduit between the offshore and onsite teams

Perform Analysis, Design, Coding, Testing and implementation, Support activities.

Perform Unit and Integration Testing Activities

Develop and maintain project documentation including technical specification, unit test plans, presentations, user documentations and training materials

Environment

Teradata 14/15, Oracle 9i, Autosys, Teradata SQL Assistant, FastLoad, MultiLoad, TPT, BTEQ, UNIX, UNICA, Campaign Portal, Informatica, Hadoop

Teradata Research and Development Labs (Jan 2010 to Nov 2014)

Hyderabad, AP, INDIA

Role: Teradata Feature Developer (Software Developer)

Responsibilities

Successfully executed various projects like:

PRPD (Partial Redistribution Partial Duplication) by developing test cases for functional & performance testing and coding modules across different levels of the Teradata Database in 2012.

Teradata Columnar (Column Partitioning) project by developing test cases for functional & performance testing and coding of modules across different levels of the Teradata Database in 2011

As a member of dbsfrontline, provided support to various teams in identifying and solving database issues.

Provided support to the Customer Support Team and handled escalated issues.

Acknowledged with Night On the Town (NOTT) Award twice for outstanding contribution in Teradata Columnar and PRPD Projects in 2012.

Enhanced the testing validation procedure by developing Stored Procedures using XML plans.

Developed Perl scripts to automate the quantum plan comparison testing procedure

Developed plan comparison tool using C language to compare the query plans and print the plan in tree format

PROJECTS UNDERTAKEN

Title: Reduce Spool Feature

Period: June 2013 to Feb 2014

Description: In Teradata query processing, a SQL request is processed in a series of steps (retrieval step, join step, SUM step, etc.) where each step consumes and produces data that is consumed by the next step. Between steps, the data is written into a spool and the next step must read from it. The main objective of this feature is to reduce the number of intermediate spools used during query processing, which reduces I/O and thus improves overall query processing time.
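The spool usage described above is visible in any Teradata EXPLAIN output; for example (table names hypothetical, and the plan behavior described in the comment is the general pattern, not output from a specific system):

```sql
EXPLAIN
SELECT c.cust_name, SUM(o.order_amt)
FROM cust c
JOIN orders o
  ON c.cust_id = o.cust_id
GROUP BY c.cust_name;
-- The plan typically shows retrieval steps spooling each source, a join step
-- reading those spools into yet another spool, and a SUM step consuming that
-- spool. Reduce Spool aims to eliminate some of these intermediate spools.
```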

Role:

Assessed the code and developed new logic to handle this cost-based duplicate-removal approach

Documented the HLSDS and SIS.

Developed test cases to test the new code

Fixed bugs opened during feature testing.

Title: Merge Join Cleanup

Period: Feb 2013 to June 2013

Description: To separate AMP code handling special forms of merge join for PPI tables from the main line merge join code. This is a quality enhancement meant to reduce incoming regressions in queries utilizing merge join, and allow merge join to be easily enhanced in the future to support new optimizations.

Role:

Tested this mini-feature by developing complex SQL queries

Title: PRPD (Partial Redistribution Partial Duplication)

Period: May 2012 to Jan 2013

Description: Partial Redistribution Partial Duplication (PRPD) is a join strategy implemented to resolve performance problems caused by data skew in equality joins on non-PI columns. If skewed values exist on the join columns or expressions in one or both join sources, performance degrades when the optimizer tries to redistribute the rows, or chooses local geography for one relation (table/spool) and duplicate geography for the other; either approach can end in a hot-AMP situation. PRPD handles this by splitting the skewed and non-skewed rows of both relations into two or three different spools and joining those spools back. This improves the optimizer's chances of choosing the best plan and avoids the hot-AMP situation by equalizing the workload on each AMP.

Role:

Designed & implemented code in the Optimizer area

Assessed the code, identified the root cause of problem and rendered fix for bugs identified during testing

Looked after the Teradata ‘optimizer & the dispatcher’ database code and related bugs

Developed test cases to test the feature

Environment

Teradata 14/15, Teradata SQL Assistant, Fast Load, MultiLoad, Teradata Administrator, BTEQ, UNIX


