
Teradata Developer

Location:
Wilmington, DE
Posted:
March 03, 2015


Resume:

Prasad Rao

Sr. Teradata Developer

Email: ***********@*****.***

Mobile: 510-***-****

Job-Related Summary:

1. 8 years of experience in the IT industry.

2. 7 years of experience as a Teradata Developer.

3. 4 years of experience in performance tuning, capacity planning, and security.

4. 5 years of experience in UNIX/Linux/Perl scripting.

5. 5 years of experience providing ETL and BI integration solutions.

6. 5 years of experience in data modeling.

7. 2 years of experience in Teradata administration.

8. Experience with large and complex DWH implementations.

9. Good communication and interpersonal skills.

Summary:

Over 7 years of extensive experience in data migration, data warehousing, database design, and manual testing.

Extensive experience with ETL/Data warehousing tools in the Financial, Healthcare, and Retail industries.

Involved in various stages of Software Development Life Cycle (SDLC).

Proficient in converting logical data models to physical database designs in a data warehousing environment, with an in-depth understanding of database hierarchy and data integrity concepts.

Skilled in data warehouse data modeling using Star and Snowflake schemas.

Good knowledge of Teradata RDBMS Architecture, Tools & Utilities.

Sourced data from disparate sources such as Mainframe z/OS, UNIX flat files, IBM DB2, Oracle, and SQL Server, and loaded it into Oracle and Teradata data warehouses.

Extracted source data from Mainframe z/OS into the UNIX environment using JCL scripts and SQL, and created formatted reports for business users with BTEQ scripts.
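
For illustration, a minimal BTEQ report script of the kind described here. The logon placeholders and all object names (mydb.sales, region, sale_dt, sale_amt) are hypothetical, not from a specific project:

    .LOGON tdpid/username,password
    .SET WIDTH 120
    .SET FORMAT ON
    .SET HEADING 'Daily Sales Summary'
    SELECT region (TITLE 'Region')
         , sale_dt (TITLE 'Sale Date')
         , SUM(sale_amt) (TITLE 'Total Sales', FORMAT 'ZZZ,ZZZ,ZZ9.99')
    FROM   mydb.sales
    GROUP  BY region, sale_dt
    ORDER  BY region, sale_dt;
    .LOGOFF
    .QUIT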

Strong Teradata SQL and ANSI SQL coding skills.

Expertise in report formatting, batch processing, and data loading and export using BTEQ.

Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right primary index, scheduling statistics collection, and adding secondary or join indexes.
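
As a hedged sketch of this tuning work (mydb.sales, mydb.customers, and the columns are hypothetical names): an EXPLAIN to read the optimizer's plan, COLLECT STATISTICS to refresh the demographics it relies on, and a driver-table rebuild on a more selective primary index:

    /* Inspect the optimizer's join plan for a slow query */
    EXPLAIN
    SELECT c.region, SUM(s.sale_amt)
    FROM   mydb.sales s
    JOIN   mydb.customers c ON s.customer_id = c.customer_id
    GROUP  BY c.region;

    /* Refresh the statistics the optimizer relies on */
    COLLECT STATISTICS ON mydb.sales COLUMN (customer_id);

    /* Rebuild the driver table on a better primary index */
    CREATE TABLE mydb.sales_new AS mydb.sales WITH DATA
    PRIMARY INDEX (customer_id);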

Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard, and Visual Explain.

Developed UNIX shell scripts and used the BTEQ, FastLoad, MultiLoad, and FastExport utilities extensively to load the target database.
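
A minimal FastLoad sketch of the kind of load script this refers to; the TDPID, file path, and staging table (mydb.stg_sales, which FastLoad requires to be empty) are placeholders:

    LOGON tdpid/username,password;
    BEGIN LOADING mydb.stg_sales
          ERRORFILES mydb.stg_sales_err1, mydb.stg_sales_err2;
    SET RECORD VARTEXT "|";
    DEFINE sale_id  (VARCHAR(10)),
           sale_dt  (VARCHAR(10)),
           sale_amt (VARCHAR(15))
    FILE = /data/in/sales.txt;
    INSERT INTO mydb.stg_sales (sale_id, sale_dt, sale_amt)
    VALUES (:sale_id, :sale_dt, :sale_amt);
    END LOADING;
    LOGOFF;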

Used the OLAP analytical power of Teradata, applying OLAP functions such as RANK, QUANTILE, CSUM, MSUM, and GROUP BY GROUPING SETS to generate detail reports for marketing users.
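
For example, against a hypothetical mydb.sales table; the windowed SUM/AVG forms below are the modern equivalents of the legacy CSUM/MSUM calls:

    SELECT region
         , sale_dt
         , sale_amt
         , RANK() OVER (PARTITION BY region ORDER BY sale_amt DESC) AS amt_rank
         , SUM(sale_amt) OVER (PARTITION BY region ORDER BY sale_dt
                               ROWS UNBOUNDED PRECEDING) AS running_amt   -- CSUM-style running total
         , AVG(sale_amt) OVER (PARTITION BY region ORDER BY sale_dt
                               ROWS 6 PRECEDING) AS moving_avg_7          -- MSUM/MAVG-style moving window
    FROM mydb.sales;

    /* Multiple aggregation levels in one pass */
    SELECT region, product, SUM(sale_amt) AS tot_amt
    FROM   mydb.sales
    GROUP  BY GROUPING SETS ((region), (region, product), ());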

Extensively used derived tables, volatile tables, and global temporary tables (GTTs) in many of the BTEQ scripts.
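
A typical pattern, sketched with placeholder names: a session-local volatile table built once and reused by later steps in the same BTEQ session:

    CREATE VOLATILE TABLE vt_top_cust AS (
        SELECT customer_id, SUM(sale_amt) AS tot_amt
        FROM   mydb.sales
        GROUP  BY customer_id
    ) WITH DATA
    PRIMARY INDEX (customer_id)
    ON COMMIT PRESERVE ROWS;   -- keep rows across transactions for the session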

Involved in designing and building stored procedures and macros for the application module.
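
A macro sketch along those lines, with hypothetical names; parameters are referenced with a leading colon, and the macro is run with EXEC:

    CREATE MACRO mydb.rpt_daily_sales (in_dt DATE) AS (
        SELECT region, SUM(sale_amt) AS tot_amt
        FROM   mydb.sales
        WHERE  sale_dt = :in_dt
        GROUP  BY region;
    );

    EXEC mydb.rpt_daily_sales (DATE '2014-12-31');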

Expertise in Oracle 9i/8.x/7.x, SQL, PL/SQL, procedures, functions, database packages, and triggers, with experience in troubleshooting, SQL statement tuning, query optimization, and dynamic SQL.

Good knowledge of performance tuning, application development, and application support in UNIX, MVS, and Windows NT environments.

Developed UNIX Shell scripts for Batch processing.

Responsible for writing deployment/release notes before each project release and the end-user manual for the production team.

Technical Skills

Operating Systems: Red Hat Linux, MVS, Windows 2000 Server, Windows OS, Mac OS.

Languages: C, C++, SQL, T-SQL, PL/SQL, Teradata SQL, MS Access

Databases: Teradata 13.10/V12/V2R6/V2R5, Oracle 8i/9i, dBase, DB2

Backup and Recovery: VERITAS NetBackup, Teradata ARCMAIN, TARA GUI.

System Monitoring: Teradata Viewpoint, PMON, Teradata Manager

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPT, FastExport, TPump, OleLoad

ETL Tools: Informatica 8.6, DataStage.

Reporting: Business Objects 3.1/3.0, Cognos 8, OBIEE

Education:

• Bachelor of Engineering in Computer Science, JNTU. India.

Certification:

• Teradata 12 Certified Professional

www.teradata.com/Certification

Verification # : ZXE15NBKKNVQ2XB4

Professional Experience

JPMORGAN CHASE

Wilmington, DE May 2013 – December 2014

Teradata System DBA

Technical Environment: Teradata Database 13.10, Teradata TARA GUI, NetBackup, Teradata Appliance 2650, Teradata Administrator, Teradata Viewpoint, PDCR, Teradata Manager, PMON, Teradata SQL Assistant, MultiLoad, FastLoad, FastExport, TPump, TPT, UNIX/Linux, and Data Mover

Responsibilities:

• Interacted with the business users to gather business requirements.

• Designed semantic data model.

• Created and implemented views and stored procedures based on user requirements.

• Created Teradata tables and the indexes on them.

• Created join indexes to reduce repeated join overhead on large tables (see the join index sketch after this list).

• Heavily involved in performance tuning procedures.

• Heavily involved in data profiling, identifying data issues, cleansing the data, and reloading it.

• Performed unit, integration, and UAT testing while verifying defect fixes.

• Used Teradata FastExport to export large volumes of data.

• Used Teradata BTEQ for implementing the business logic.

• Used Teradata MultiLoad to import data into multiple tables.

• Involved in the design and user documentation.

• Interacted with the end users during defect meetings and conferences.

• Used the Teradata Manager collection facility to set up AMP usage collection, canary query response, spool usage response, etc.

• Worked on capacity planning and produced disk and CPU usage growth reports using Teradata Manager, DBQL, and ResUsage (see the DBQL sketch after this list).

• Wrote UNIX shell scripts to automate common tasks.

• Created Data Mover jobs with the Data Mover portlet in Viewpoint.
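
For illustration, a join index sketch of the kind referenced above; mydb.sales, mydb.customers, and the columns are placeholder names. Once defined, the optimizer can satisfy region-level queries from the index without re-executing the join:

    CREATE JOIN INDEX mydb.ji_sales_region AS
    SELECT c.region, s.customer_id, s.sale_amt
    FROM   mydb.sales s
    JOIN   mydb.customers c ON s.customer_id = c.customer_id
    PRIMARY INDEX (region);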
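
And a hedged sketch of the capacity-planning queries, assuming the standard DBC.QryLogV (DBQL) and DBC.DiskSpaceV system views; exact view and column names vary by Teradata release:

    /* CPU and I/O growth by day and user, from DBQL */
    SELECT CAST(StartTime AS DATE) AS run_dt
         , UserName
         , SUM(AMPCPUTime)   AS cpu_secs
         , SUM(TotalIOCount) AS io_cnt
    FROM   DBC.QryLogV
    GROUP  BY 1, 2
    ORDER  BY 1, 3 DESC;

    /* Current vs. maximum perm space by database */
    SELECT DatabaseName
         , SUM(CurrentPerm) AS used_bytes
         , SUM(MaxPerm)     AS max_bytes
    FROM   DBC.DiskSpaceV
    GROUP  BY 1
    ORDER  BY 2 DESC;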

Sutter Health, Sacramento, CA October 2011 – March 2013

Teradata Developer

Technical Environment: Teradata Database 13.10, Teradata TARA GUI, NetBackup, Teradata Administrator, Teradata Viewpoint, PDCR, Teradata Manager, PMON, Teradata SQL Assistant, MultiLoad, FastLoad, FastExport, TPump, TPT, and UNIX/Linux

Responsibilities:

• Migrated tables from Oracle to Teradata.

• Wrote BTEQ and MultiLoad scripts to load data from Oracle to Teradata (see the MultiLoad sketch after this list).

• Analyzed the dependencies of the existing job on Oracle data mart.

• Used UNIX/Perl scripts to access Teradata & Oracle Data.

• Moved data from Teradata to Oracle using FastExport and Oracle SQL*Loader.

• Worked with the Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, and Informatica Server Manager.

• Created the Informatica metadata repository with the Repository Manager as a hub for interaction between the various tools; security, user management, and repository backups were also handled with the same tool.


• Used the Informatica Designer tools to design source definitions, target definitions, and the transformations used to build mappings.

• Created the mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.

• Used Server Manager to create and maintain sessions, and to monitor, edit, schedule, copy, abort, and delete them.

• Applied efficient ETL mapping and transformation techniques.

• Developed customized programs and scripts using Perl, UNIX shell, and SQL as needed.

• Produced documentation and procedures for best practices in Teradata development and administration.

• Used Erwin Data Modeler for data modeling with Star and Snowflake schemas, fact and dimension tables, and physical and logical data models for the data warehouse and data marts.

• Involved in database upgrades and TTU client software upgrades.

• Wrote UNIX shell scripts to automate common tasks.

• Proven team player; collaborated effectively with cross-functional teams, worked hard, and adapted quickly to the demands of a high-pressure environment.
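
A minimal MultiLoad sketch of the kind of load script mentioned in the list above; the TDPID, file path, and table names are placeholders:

    .LOGTABLE mydb.ml_sales_log;
    .LOGON tdpid/username,password;
    .BEGIN MLOAD TABLES mydb.sales;
    .LAYOUT sales_layout;
    .FIELD sale_id  * VARCHAR(10);
    .FIELD sale_dt  * VARCHAR(10);
    .FIELD sale_amt * VARCHAR(15);
    .DML LABEL ins_sales;
    INSERT INTO mydb.sales (sale_id, sale_dt, sale_amt)
    VALUES (:sale_id, :sale_dt, :sale_amt);
    .IMPORT INFILE /data/in/sales_delta.txt
            FORMAT VARTEXT '|'
            LAYOUT sales_layout
            APPLY ins_sales;
    .END MLOAD;
    .LOGOFF;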

State of Michigan, Lansing, MI January 2011 – October 2011

Teradata Developer

Technical Environment: Teradata Database 13.10, PMON, Teradata SQL Assistant, MultiLoad, FastLoad, FastExport, TPump, TPT, Teradata Administrator, Teradata Viewpoint, PDCR, Teradata Manager, UNIX/Linux, and Windows NT

Responsibilities:

• Worked on loading data from several flat file sources using Teradata FastLoad and MultiLoad.

• Wrote Teradata SQL queries for joins and other table modifications.

• Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.

• Performed database-to-database transfers of data (with minimal transformations) using ETL (Ab Initio).

• Fine-tuned the existing mappings, achieving better performance and reduced load times for faster user queries.

• Created customized MultiLoad scripts on the UNIX platform for Teradata loads using Ab Initio.

• Sorted data files using UNIX Shell scripting.

• Fine-tuned MultiLoad scripts based on the number of scheduled loads and the data volumes involved.

• Used a data profiler in ETL processes and data integrators to verify client requirements, including checks on column properties, column values, and referential integrity.

• Acted as the single resource solely responsible for Ab Initio to Teradata conversions.

• Wrote scripts to extract data from Oracle and load it into Teradata.

• Worked on exporting data using Teradata FastExport (see the FastExport sketch after this list).

• Wrote Teradata BTEQ scripts to implement business logic (see the BTEQ sketch after this list).

• Hands-on with Teradata Queryman (SQL Assistant) to interface with Teradata.

• Used SQL Profiler for troubleshooting, monitoring, and optimizing SQL Server on behalf of developers and testers.

• Used UNIX scripts to access Teradata & Oracle Data.

• Developed UNIX shell scripts for data manipulation.

• Involved in writing proactive data audit scripts.

• Involved in writing data quality scripts for new market integration.

• Developed complex transformation code for derived duration fields.

• Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
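
As an illustration of the FastExport work referenced in this list (the table, columns, and paths are hypothetical; the double CAST is the usual idiom for rendering a DATE as text):

    .LOGTABLE mydb.fexp_log;
    .LOGON tdpid/username,password;
    .BEGIN EXPORT SESSIONS 8;
    .EXPORT OUTFILE /data/out/claims_extract.txt
            MODE RECORD FORMAT TEXT;
    SELECT TRIM(claim_id) || '|' ||
           CAST(CAST(claim_dt AS DATE FORMAT 'YYYY-MM-DD') AS VARCHAR(10)) || '|' ||
           TRIM(CAST(claim_amt AS VARCHAR(20)))
    FROM   mydb.claims
    WHERE  claim_dt >= DATE '2011-01-01';
    .END EXPORT;
    .LOGOFF;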
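
And a BTEQ sketch of the business-logic pattern referenced in this list, with hypothetical tables; ERRORCODE branching is the standard way to fail a batch step cleanly:

    .LOGON tdpid/username,password

    UPDATE mydb.policy
    SET    status_cd = 'LAPSED'
    WHERE  paid_thru_dt < CURRENT_DATE - INTERVAL '30' DAY;

    .IF ERRORCODE <> 0 THEN .GOTO step_failed

    INSERT INTO mydb.audit_log (step_nm, run_ts)
    VALUES ('policy_lapse', CURRENT_TIMESTAMP);

    .LOGOFF
    .QUIT 0

    .LABEL step_failed
    .QUIT 12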


Intergraph, India June 2007 - December 2010

Teradata Developer/DBA

Technical Environment: Teradata Database V2R5, Teradata Administrator, Teradata Manager, PMON, Index Wizard, Statistics Wizard, Visual Explain, Teradata SQL Assistant, UNIX, and Windows NT

Responsibilities:

• Worked on loading data from several flat file sources using Teradata FastLoad and MultiLoad.

• Wrote Teradata SQL queries for joins and other table modifications.

• Transferred large volumes of data using Teradata FastLoad, MultiLoad, and TPump.

• Extensively used the Ab Initio ETL tool to design and implement Extract, Transform, and Load processes; various Ab Initio components were used to develop and maintain the database.

• Gathered business requirements through extensive interaction with business analysts and reporting teams, and assisted in developing the low-level design documents.

• Used inquiry and error functions such as is_valid, is_error, and is_defined, and string functions such as string_substring, string_concat, and other string_* functions in Ab Initio graphs to perform data validation and cleansing.

• Implemented a 6-way multifile system in the test environment, composed of individual files on different nodes, partitioned and stored in distributed directories.

• Used partition components (Partition by Key, Partition by Expression, Partition by Round-Robin) to split large data files into multiple data files.

• Architected the database design; created and maintained the data warehouse; handled database maintenance, reorganizations, and upgrades.

• Developed the disaster recovery plan, its implementation, and the business continuity plan.

• Monitored query run times using Teradata Performance Monitor.

• Worked with developers to convert functional specifications into technical specifications, including DDL generation.

• Developed shell scripts to create and drop tables and indexes for pre- and post-session management.

• Worked with developers to identify and resolve performance issues in the ODS population.

• Performed analysis that fed into impact analysis and resolution of the issues found.

• Involved in setting standards and compliance requirements for the database.

• Consolidated models into the repository and into the main subject area model, identifying impacts and correcting them.



