
Teradata/Informatica developer

Location:
Atlanta, GA
Posted:
January 18, 2017


PROFESSIONAL SUMMARY:

Over *+ years of experience in IT, with proficiency as a Teradata/Informatica developer and strong expertise in SQL queries, stored procedures, and Teradata macros.

Expertise in loading and extracting data from Teradata using utilities such as Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, FastExport, TPump, and Teradata Parallel Transporter (TPT).

Experienced in writing Teradata SQL scripts to implement business transformation rules.

Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.

Expertise in handling multi-terabyte databases and running queries against them efficiently without performance issues.

Expertise in developing summaries and reports using Advanced Teradata SQL according to business needs.

Expertise in understanding business requirements, converting them into technical specifications, and reverse engineering existing systems and processes for documentation and bug fixing.

Experience in performance tuning of large-scale databases using views, indexing mechanisms, global temporary tables, and volatile tables in Teradata, and in moving Oracle table data to Teradata using Teradata Parallel Transporter (TPT).

Excellent experience with ETL tools such as Informatica and Informatica BDE, and in implementing Slowly Changing Dimensions (SCD).

Expertise in developing the Batch control processes in UNIX.

Involved in generating MLoad and TPump scripts to load data into Teradata tables.

Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer.

Expertise in Oracle stored procedures, triggers, indexes, and table partitions, and experienced in loading data from flat files, XML files, Oracle, DB2, and SQL Server into Data Warehouses/Data Marts using Informatica.

Created mapping documents, workflows, and data dictionaries.

Experienced in developing business reports by writing complex SQL queries using views, macros, volatile and global temporary tables.

Involved in creating Hive tables, loading them with data, and writing Hive queries that invoke MapReduce jobs in the backend.

Good knowledge of major components in the Hadoop ecosystem, such as HDFS, Hive, and Pig.

Quick adaptability to new technologies and zeal to improve technical skills.

Good analytical, programming, problem solving and troubleshooting skills.

TECHNICAL SKILLS

Operating Systems

Unix, Windows

Database

Teradata, SQL, Oracle.

Big Data

HDFS, Hive, HBase.

Languages

Teradata SQL, PL/SQL, Shell Scripting

Teradata Utilities

MultiLoad (MLoad), FastLoad, BTEQ, FastExport, TPump, and TPT

Teradata Tools

Teradata SQL Assistant, Teradata Manager

ETL tools

Informatica PowerCenter 9.0, 9.1, 9.5

Process/Methodologies

Waterfall Methodology, Agile Methodology, FSLDM

PROFESSIONAL EXPERIENCE

AT&T, Atlanta, GA Sept 2016 – Present

Teradata/Hadoop Developer

DPV (Data Patterns for visitors)

Responsibilities:

•Performed data ingestion from Teradata to Hadoop via Sqoop imports, and performed validations and consolidations on the imported data.

•Ingested data sets from different databases and servers using the Sqoop import tool and the MFT (Managed File Transfer) inbound process (see the Sqoop sketch after this list).

•Responsible for coordinating with Business System Analysts during requirements gathering and for creating technical specification documents; gathered requirements from business users of the various systems.

•Executed jobs to load data into the DPV portal.

•Created reports by brand and store, including the number of records input, loaded, and dropped, along with demographic and feature validations and quality checks on the results.

•As part of support, responsible for troubleshooting MapReduce jobs, Pig jobs, and Hive queries.

•Created a migration process to productionize the developed code and standardized the Build-to-Run document for handing the code over to the run team for production support.

•Developed ETL processes to load data into HDFS using Informatica BDE from Teradata and Oracle databases.

•Developed shell scripts to load flat files from various sources into HDFS (see the HDFS load sketch after this list).

•Developed MapReduce jobs using Java, Oozie, and Spark to validate and transform data and load it into HBase.

•Created BTEQ scripts with data transformations for loading the base tables (see the BTEQ sketch after this list). Worked on optimizing and tuning Teradata SQL to improve batch performance and data response time for users.

•Used the FastExport utility to extract large volumes of data and send files to downstream applications.

•Provided performance tuning and physical and logical database design support in projects for Teradata systems, and managed user-level access rights through roles.

•Prepared test data for unit testing and data validation tests to confirm the transformation logic.

•Created partitions and bucketing by state in Hive to handle structured data (see the Hive sketch after this list).

•Implemented dashboards that run HiveQL queries internally, including aggregation functions, basic Hive operations, and different kinds of join operations.

•Implemented state-based business logic in Hive using generic UDFs; used HBase-Hive integration.

•Managed and scheduled batch jobs on a Hadoop cluster using Oozie.

•Managed and reviewed Hadoop log files; responsible for analyzing multi-platform applications using Python.
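
For illustration of the Sqoop ingestion mentioned above, here is a minimal sketch of a Sqoop import from Teradata into HDFS. The host, credentials, database, table, and directory names are hypothetical placeholders, not project values.

# Hypothetical Sqoop import of a Teradata table into a raw HDFS directory.
sqoop import \
  --connect jdbc:teradata://td-host.example.com/DATABASE=visitor_db \
  --driver com.teradata.jdbc.TeraDriver \
  --username etl_user \
  --password-file /user/etl/.td_password \
  --table VISITOR_EVENTS \
  --target-dir /data/raw/visitor_events \
  --num-mappers 8 \
  --fields-terminated-by '\001'

A similar import with --hive-import could land the data directly in a Hive table; keeping a plain HDFS target, as sketched here, leaves the validation and consolidation step separate.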
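
A sketch of the kind of shell script used to land flat files in HDFS, as described above; the directory layout and file naming are assumptions for illustration.

#!/bin/bash
# Hypothetical batch step: push inbound flat files into a dated HDFS directory.
SRC_DIR=/data/inbound/flatfiles
HDFS_DIR=/data/raw/flatfiles/$(date +%Y%m%d)

hdfs dfs -mkdir -p "$HDFS_DIR"
for f in "$SRC_DIR"/*.dat; do
  [ -e "$f" ] || continue              # nothing to do if no files matched
  hdfs dfs -put -f "$f" "$HDFS_DIR"/   # copy the file into HDFS
  mv "$f" "$SRC_DIR"/processed/        # archive the local copy
done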
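
The BTEQ load scripts referenced above typically run as UNIX batch steps. Below is a minimal, hedged sketch of such a script wrapped in a shell heredoc; the TDPID, credentials, database, and table names are illustrative assumptions only.

#!/bin/bash
# Hypothetical BTEQ batch step: load a base table from staging with a simple transformation.
bteq <<'EOF'
.LOGON tdprod/etl_user,xxxxxxxx;
.SET ERROROUT STDOUT;

INSERT INTO base_db.customer_dim (cust_id, cust_name, load_dt)
SELECT s.cust_id,
       TRIM(s.cust_name),
       CURRENT_DATE
FROM   stg_db.customer_stg s;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF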
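
As a sketch of the state-based partitioning and bucketing described above, the HiveQL below creates a partitioned, bucketed table and loads it with dynamic partitioning; the database, table, and column names are assumptions for illustration.

#!/bin/bash
# Hypothetical Hive DDL/DML executed from a shell batch step.
hive -e "
CREATE TABLE IF NOT EXISTS visitor_db.visitor_events (
  visitor_id STRING,
  event_ts   TIMESTAMP,
  brand      STRING
)
PARTITIONED BY (state STRING)
CLUSTERED BY (visitor_id) INTO 32 BUCKETS
STORED AS ORC;

SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.enforce.bucketing=true;

INSERT OVERWRITE TABLE visitor_db.visitor_events PARTITION (state)
SELECT visitor_id, event_ts, brand, state
FROM visitor_db.visitor_events_stg;
"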

Environment:

Cloudera CDH 5.4.10, HDFS, Hive, MapReduce, HBase, Oozie, Zookeeper, Sqoop, Teradata SQL Assistant, MLOAD, FASTLOAD, BTEQ, TPUMP, FASTEXPORT, Oracle, UNIX Shell, QlikView.

Charles Schwab Corporation, GA Nov 2015 – Aug 2016

Teradata Developer

The Charles Schwab Corporation is a savings and loan holding company. Through its subsidiaries, the company engages in wealth management, securities brokerage, banking, money management, and financial advisory services. The company provides financial services to individuals and institutional clients through two segments: Investor Services and Adviser Services.

Responsibilities:

•Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.

•Translate customer requirements into formal requirements and design documents.

•Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata (see the FastLoad sketch after this list).

•Wrote MultiLoad, FastLoad, and BTEQ scripts to load data into stage tables and then process it into BID.

•Developed scripts to load data from source to staging and from staging to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.

•Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.

•Developed mappings in Informatica to load data from various sources using transformations such as Lookup (connected and unconnected), Normalizer, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.

•Developed and reviewed detailed design documents and technical specification documents for the end-to-end ETL process flow for each source system.

•Involved in Unit Testing and Preparing test cases.

•Modification of views on Databases, Performance Tuning and Workload Management.

•Maintained access rights and role rights, priority scheduling, Dynamic Workload Manager, Database Query Log, database administration, Partitioned Primary Indexes (PPI), multi-value compression analysis, usage collection and reporting of resource usage and AMP usage, and security administration setup; led a team of developers in working with different users on complicated technical issues.

•Performed application-level DBA activities such as creating tables and indexes, and monitored and tuned Teradata BTEQ scripts.

•Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.

•Collected multi-column statistics on all non-indexed columns used in join operations and on all columns used in residual conditions (see the statistics sketch after this list).

•Extensively used derived tables, volatile tables, and global temporary tables (GTTs) in many of the ETL scripts (see the volatile-table sketch after this list).

•Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, rewriting correlated subqueries, and using hash functions.
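
The FastLoad loads into EDW stage tables mentioned above generally follow a pattern like the sketch below. The TDPID, credentials, file path, delimiter, and table names are hypothetical, and the target table must be empty, as FastLoad requires.

#!/bin/bash
# Hypothetical FastLoad step: bulk load a pipe-delimited flat file into an empty stage table.
fastload <<'EOF'
LOGON tdprod/etl_user,xxxxxxxx;
DATABASE stg_db;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60))
FILE = /data/inbound/customer.dat;
BEGIN LOADING stg_db.customer_stg
      ERRORFILES stg_db.cust_err1, stg_db.cust_err2;
INSERT INTO stg_db.customer_stg VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF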
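
A hedged sketch of the multi-column statistics collection described above; the table and column names are placeholders for illustration.

#!/bin/bash
# Hypothetical statistics-collection step run from BTEQ; credentials are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,xxxxxxxx;

/* multi-column statistics on join columns */
COLLECT STATISTICS ON base_db.acct_txn COLUMN (acct_id, txn_dt);

/* single-column statistics on a residual-condition column */
COLLECT STATISTICS ON base_db.acct_txn COLUMN (branch_cd);

.LOGOFF;
.QUIT 0;
EOF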
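
And a sketch of the volatile-table usage mentioned above, again with illustrative names only: an intermediate aggregate is staged in a volatile table and then applied to a target table.

#!/bin/bash
# Hypothetical example: stage an intermediate result in a volatile table, then apply it.
bteq <<'EOF'
.LOGON tdprod/etl_user,xxxxxxxx;

CREATE VOLATILE TABLE vt_daily_bal AS
( SELECT acct_id, SUM(txn_amt) AS daily_amt
  FROM   base_db.acct_txn
  WHERE  txn_dt = CURRENT_DATE
  GROUP  BY 1
) WITH DATA
PRIMARY INDEX (acct_id)
ON COMMIT PRESERVE ROWS;

UPDATE tgt
FROM base_db.acct_balance AS tgt, vt_daily_bal AS v
SET balance_amt = tgt.balance_amt + v.daily_amt
WHERE tgt.acct_id = v.acct_id;

.LOGOFF;
.QUIT 0;
EOF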

Environment:

Teradata 14, Teradata SQL Assistant, UNIX, Informatica 9x, BTEQ, MLOAD, TPUMP, FASTLOAD, FASTEXPORT, TPT.

Teradata Developer Aug 2014 – Nov 2015

IBM India Pvt. Ltd.

DBS (The Development Bank of Singapore) is a Singaporean multinational banking and financial services company. The main aim of this project was to load data coming from different source systems, such as mainframe DB2 and open systems. Based on user requirements, we developed scripts (BTEQ, MLoad, TPT) and loaded the data into the target data mart. We used Informatica to extract the required source data from various relational sources, validate it, and load it into the Teradata staging area, then transformed the data accordingly and loaded it into the centralized warehouse using the Teradata utilities.

Responsibilities:

•Responsible for coordinating with Business System Analysts during requirements gathering and for creating technical specification documents; gathered requirements from business users of the various systems.

•Fixed issues with the existing FastLoad/MultiLoad scripts for smoother, more effective loading of data into the warehouse.

•Worked on loading data from several flat-file sources to staging using MLOAD and FLOAD.

•Created BTEQ scripts with data transformations for loading the base tables.

•Worked on optimizing and tuning Teradata SQL to improve batch performance and data response time for users.

•Used the FastExport utility to extract large volumes of data and send files to downstream applications (see the FastExport sketch after this list).

•Provided performance tuning and physical and logical database design support in projects for Teradata systems and managed user level access rights through the roles.

•Created BTEQ scripts to load data from the Teradata staging area to target tables.

•Performed tuning of Queries for optimum performance.

•Prepared test data for unit testing and data validation tests to confirm the transformation logic.

•Tuned various queries by collecting statistics on columns used in WHERE and JOIN expressions.

•Performed performance tuning of Teradata SQL statements using Teradata EXPLAIN, and collected statistics periodically on tables to improve system performance.

•Instrumental in Team discussions, Mentoring and Knowledge Transfer.

•Responsible for implementation and post-implementation support.

•Documentation of scripts, specifications and other processes.
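
A minimal sketch of the FastExport extract described above, as it might run from a UNIX batch step; the log table, logon, output path, and query are placeholders, not project values.

#!/bin/bash
# Hypothetical FastExport step: write a delimited extract for a downstream application.
fexp <<'EOF'
.LOGTABLE work_db.cust_extract_log;
.LOGON tdprod/etl_user,xxxxxxxx;
.BEGIN EXPORT SESSIONS 8;
.EXPORT OUTFILE /data/outbound/cust_extract.dat MODE RECORD FORMAT TEXT;

SELECT TRIM(cust_id) || '|' || TRIM(cust_name)
FROM   base_db.customer_dim;

.END EXPORT;
.LOGOFF;
EOF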

Environment:

Teradata 14, Teradata SQL Assistant, Informatica, MLOAD, FASTLOAD, BTEQ, TPUMP, UNIX Shell Scripting, Windows XP.

Accel Frontline, India Nov 2012 - Jul 2014

Teradata Developer

The project involved migrating data from source systems to the EDW (Enterprise Data Warehouse). The process included extracting data from source systems, applying transformations, and loading the data after query tuning and performance checks. These processes were scheduled to run daily, weekly, or monthly. Teradata SQL and client utilities played a significant role in migrating the data to the data warehouse and achieving the expected gains.

Responsibilities:

•Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.

•Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.

•Wrote MultiLoad, FastLoad, and BTEQ scripts to load data into stage tables and then process it into BID (see the MultiLoad sketch after this list).

•Developed scripts to load data from source to staging and from staging to target tables using load utilities such as BTEQ, FastLoad, and MultiLoad.

•Wrote scripts for data cleansing, data validation, and data transformation for data coming from different source systems.

•Developed mappings in Informatica to load data from various sources using transformations such as Lookup (connected and unconnected), Normalizer, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.

•Used Informatica designer to create complex mappings using different transformations to move data to a Data Warehouse.

•Developed UNIX shell scripts to run batch jobs and loads into production.

•Involved in Unit Testing and Preparing test cases.

•Involved in Peer Reviews.

•Modification of views on Databases, Performance Tuning and Workload Management.

•Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.

•Performed query analysis using EXPLAIN, checking for unnecessary product joins, confidence factors, join types, and the order in which tables are joined.

•Collected multi-column statistics on all non-indexed columns used in join operations and on all columns used in residual conditions.

•Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, rewriting correlated subqueries, and using hash functions.
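
For the MultiLoad loads referenced above, here is a hedged sketch of an INSERT-only IMPORT-style MLoad script; the log, work, and error table names, the target table, and the input file are illustrative assumptions.

#!/bin/bash
# Hypothetical MultiLoad step: simple INSERT-only import of a pipe-delimited file into a stage table.
mload <<'EOF'
.LOGTABLE work_db.cust_ml_log;
.LOGON tdprod/etl_user,xxxxxxxx;
.BEGIN IMPORT MLOAD TABLES stg_db.customer_stg
       WORKTABLES stg_db.customer_wt
       ERRORTABLES stg_db.customer_et stg_db.customer_uv;
.LAYOUT cust_layout;
.FIELD cust_id   * VARCHAR(18);
.FIELD cust_name * VARCHAR(60);
.DML LABEL ins_cust;
INSERT INTO stg_db.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.IMPORT INFILE /data/inbound/customer.dat
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;
.END MLOAD;
.LOGOFF;
EOF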

Environment:

Teradata V14, SQL Assistant, Informatica, Teradata Utilities (Fast Load, Multi Load, Fast Export, SQL Assistant, Bteq, TPT), Control-M, UNIX, FTP.

Infinite Computer Solutions, India Jan 2011 – Oct 2012

Teradata Developer

Responsibilities:

•Involved in the complete Software Development Life Cycle (SDLC), from business analysis to development, testing, deployment, and documentation.

•Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.

•Wrote BTEQ scripts to transform data.

•Created Semantic views on Teradata database

•Worked on Teradata Parallel Transporter (TPT) to load data from databases and files into Teradata.

•Wrote views based on user and/or reporting requirements.

•Wrote Teradata macros and used various Teradata analytic functions (see the macro sketch after this list).

•Involved in migration projects to move data warehouses from Oracle/DB2 to Teradata.

•Performance tuned and optimized various complex SQL queries.

•Used Informatica Power Center 9.0 for Extraction, Transformation and Loading data from heterogeneous source systems into the target database.

•Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.

•Developed standards and procedures for transformation of data as it moves from source systems to the data warehouse.

•Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.

•Involved in performance and tuning of the ETL processes.

•Coordinated with business analysts and developers to discuss issues in interpreting the requirements.
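
As an illustration of the Teradata macros mentioned above, here is a minimal parameterized macro sketch; the macro, table, column, and parameter names, and the credentials, are hypothetical.

#!/bin/bash
# Hypothetical macro definition and execution run from BTEQ.
bteq <<'EOF'
.LOGON tdprod/etl_user,xxxxxxxx;

REPLACE MACRO rpt_db.daily_sales_rpt (run_dt DATE) AS
( SELECT store_id,
         SUM(sale_amt) AS tot_sales,
         COUNT(*)      AS txn_cnt
  FROM   base_db.sales_fact
  WHERE  sale_dt = :run_dt
  GROUP  BY store_id; );

EXEC rpt_db.daily_sales_rpt (DATE '2012-06-30');

.LOGOFF;
.QUIT 0;
EOF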

Environment:

Teradata R12/R13, Teradata SQL Assistant, SQL, Informatica 9x, Outlook, Putty, MLOAD, TPUMP, FASTLOAD, FASTEXPORT, TPT.

Videocon International, India Aug 2008 – Dec 2010

Teradata Developer

Responsibilities:

•Analyzing and understanding the end user requirements and business rules.

•Identify all potential issues during the requirement understanding phase and to describe actions to address those issues.

•Performed impact analysis covering schedule changes, dependency impacts, and code changes for various change requests on existing data warehouse applications running in a production environment.

•Identified database objects such as PL/SQL procedures, tables, and views that would be impacted as part of the requirement.

•Identified scripts such as BTEQ, MLoad, FLoad, and TPT that would be impacted as part of a particular project requirement.

•Coordinating and delegating the work to other team members.

•Responsible for Technical implementation of the change request solution.

•Supporting enterprise data warehouse and ETL development activities.

•Fine-tuned existing scripts and processes to achieve increased performance and reduced load times for faster user query performance.

Environment:

Teradata Utilities (Fast Load, Multi Load, Fast Export, SQL Assistant, Bteq), Teradata SQL

EDUCATION:

Bachelor of Technology in Computer Science and Engineering.

CERTIFICATION:

Teradata 14 Certified Professional.


