
Meghana Gade

Mobile: 408-***-****

Email: addc15@r.postjobfree.com

PROFESSIONAL SUMMARY:

Progressive software professional with over 7 years of Data Warehousing experience.

5+ years of experience in Teradata development and OLAP operations on Teradata databases (TD15/TD13/TD12) using TPT, BTEQ, Teradata SQL Assistant, MultiLoad, FastLoad, and FastExport.

Data warehousing ETL experience using Informatica PowerCenter 9.1/10 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

Practical understanding of data modeling concepts (dimensional and relational), including star schema modeling, snowflake schema modeling, and fact and dimension tables.

Proficient in performance analysis, monitoring, and SQL query tuning in Teradata using EXPLAIN, COLLECT STATISTICS, hints, and SQL Trace.

Experienced with Teradata Parallel Transporter and scripting on Windows/UNIX servers; created ETL dependency scripts and stored procedures.

Experience using pushdown optimization, CDC techniques, and partitioning; implemented Slowly Changing Dimension Type 1 and Type 2 methodologies to retain the full history of account and transaction information.

Experience with development, testing, debugging, implementation, documentation and production support.

Well versed in all stages of the Software Development Life Cycle (SDLC): requirement gathering and analysis, design/redesign, implementation, and testing.

Strong hands-on experience with Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, Visual Explain) as well as Teradata SQL Assistant.

Wrote FastLoad scripts to load large tables from SQL Server into the Teradata EDW, following the pattern sketched below.
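
A minimal FastLoad sketch of that pattern (logon details, table, columns, and file path are placeholders, not actual project values; FastLoad requires an empty target table):

    LOGON tdprod/etl_user,etl_password;
    DATABASE edw_stg;
    /* Error tables capture conversion and constraint rejects */
    BEGIN LOADING edw_stg.customer
        ERRORFILES edw_stg.customer_err1, edw_stg.customer_err2
        CHECKPOINT 1000000;
    SET RECORD VARTEXT "|";
    DEFINE cust_id   (VARCHAR(18)),
           cust_name (VARCHAR(100)),
           open_dt   (VARCHAR(10))
    FILE = /data/extracts/customer.txt;
    INSERT INTO edw_stg.customer (cust_id, cust_name, open_dt)
    VALUES (:cust_id, :cust_name, :open_dt);
    END LOADING;
    LOGOFF;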

Expertise with data sources ranging from flat files to MS SQL Server, MS Access, and MS Excel on Windows platforms.

Experience in performance tuning of scripts and queries migrated from Teradata to Vertica.

Experience building ETL processes to extract, transform, and load data across systems using Informatica and Teradata utilities.

Strong experience in mentoring team members and writing documentation with excellent communication and interpersonal skills.

Ambitious, self-motivated, results-oriented engineering professional able to work independently as well as in teams, with strong multi-tasking skills.

Education:

Master’s in Information Technology, Delaware.

Technical Skills:

RDBMS:

Teradata 12/13/15, Oracle 10g/11g, MS SQL Server.

Languages:

Java, C#, shell scripting.

Operating Systems:

Windows XP/7/10 and UNIX/Linux.

Tools & Utilities:

Teradata SQL Assistant, PuTTY, UNIX/Linux, BTEQ, MultiLoad, FastLoad, FastExport, shell scripting, Informatica PowerCenter 9/10 (Designer, Workflow Manager, Workflow Monitor), Cloudera Hadoop, Hive, Sqoop, Vertica.

Version Control Tools:

TFS, SVN, VSS, Git

Other:

Microsoft Office (Word, Excel, PowerPoint, Outlook).

PROFESSIONAL EXPERIENCE:

Client: AT&T, El Segundo, CA Mar 2018 – Present

Role: ETL Developer – Teradata/Vertica/Hadoop

Responsibilities:

Collaborated closely with client managers, architects, DBAs, and business users.

Involved in the complete Agile SDLC process and scrum meetings.

Developed UNIX scripts for business logic, including file processing, data loading, and data comparison.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Coordinated and monitored work items across onsite teams and clients and provided project direction to the offshore team.

Monitored application processes scheduled through Autosys and Tivoli jobs.

Migrated database from Teradata to Vertica and developed queries for Vertica database.

Performed unit and integration testing of the application in dev and QA environments.

Extracted data from flat files and other RDBMS sources into the staging area and populated the data warehouse.

Created vsql and shell scripts to migrate data from different sources (Teradata, Oracle, SQL Server) to Vertica DWS; migrated the code into QA and supported the QA and UAT teams.

Designed and developed several mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (Connected & unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator, etc.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Developed and used SQL queries to validate data in both source and target databases.

Extracted data from OLTP systems to dump files and uploaded it to the data warehouse using FastLoad and MultiLoad.

Wrote FastExport jobs in Teradata to extract large volumes of data, as in the sketch below.
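
A minimal FastExport sketch of such a job (log table, logon, names, and paths are illustrative assumptions):

    .LOGTABLE edw_wrk.sales_fexp_log;
    .LOGON tdprod/etl_user,etl_password;
    .BEGIN EXPORT SESSIONS 8;
    .EXPORT OUTFILE /data/extracts/daily_sales.txt MODE RECORD FORMAT TEXT;
    /* Concatenate columns into one pipe-delimited text record */
    SELECT CAST(store_id AS VARCHAR(10)) || '|' ||
           CAST(sale_dt  AS VARCHAR(10)) || '|' ||
           CAST(amount   AS VARCHAR(20))
    FROM edw.daily_sales;
    .END EXPORT;
    .LOGOFF;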

Created data load scripts in Vertica to load the data extracted from Teradata (see the sketch below).
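
A representative Vertica load statement run through vsql (table, file, and reject paths are hypothetical):

    /* Bulk-load a pipe-delimited extract; DIRECT writes straight to disk storage */
    COPY edw.daily_sales (store_id, sale_dt, amount)
    FROM '/data/extracts/daily_sales.txt'
    DELIMITER '|'
    NULL ''
    REJECTED DATA '/data/rejects/daily_sales.rej'
    DIRECT;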

Reviewed code developed by teammates before it moved into QA.

Debugged problems when migrating from source systems to Teradata (conversion of data types, views, synonyms, tables, etc.).

Wrote several Teradata BTEQ scripts to implement business logic (pattern sketched below) and developed queries using Teradata SQL Assistant.
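
The general shape of those BTEQ scripts, with placeholder logon, database, and business logic:

    .LOGON tdprod/etl_user,etl_password

    /* Business logic applied as set-based SQL */
    INSERT INTO edw.daily_sales
    SELECT store_id, sale_dt, SUM(amount)
    FROM   stg.sales
    GROUP BY store_id, sale_dt;

    /* Return a non-zero code on failure so the scheduler can react */
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0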

Created external tables in Hive and loaded data from HDFS.

Performed complex HiveQL queries on Hive tables for data profiling and reporting; a representative sketch follows.
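
A sketch of the Hive pattern behind the two bullets above (schema, columns, and HDFS path are assumptions for illustration):

    -- External table over files already landed in HDFS; dropping the
    -- table leaves the underlying data in place
    CREATE EXTERNAL TABLE IF NOT EXISTS edw.sales_ext (
      store_id INT,
      sale_dt  STRING,
      amount   DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    STORED AS TEXTFILE
    LOCATION '/data/landing/sales';

    -- Profiling-style HiveQL: row counts and value ranges per day
    SELECT sale_dt, COUNT(*) AS rows_loaded, MIN(amount) AS min_amt, MAX(amount) AS max_amt
    FROM edw.sales_ext
    GROUP BY sale_dt;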

Performed requirement and impact analysis for building the base data in Vertica.

Analyzed the existing Teradata database and performed data modeling in Vertica.

Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.

Created mapping documents to outline data flow from sources to targets.

Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.

Developed mappings to load staging tables and then Dimension and Fact tables, following existing ETL standards.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables; the underlying Type 2 pattern is sketched below.
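
The Type 2 logic behind such mappings, sketched as set-based SQL (the actual work was done in Informatica mappings; dimension, staging, and column names are hypothetical):

    /* Step 1: expire the current row when a tracked attribute changes */
    UPDATE dim_account
    SET    end_dt = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
      AND  EXISTS (SELECT 1
                   FROM   stg_account s
                   WHERE  s.account_id = dim_account.account_id
                     AND  s.status <> dim_account.status);

    /* Step 2: insert a new current version with an open-ended end date */
    INSERT INTO dim_account (account_id, status, start_dt, end_dt, current_flag)
    SELECT s.account_id, s.status, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_account s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_account d
                       WHERE  d.account_id = s.account_id
                         AND  d.current_flag = 'Y');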

Analyzed ETL jobs in Teradata and converted them to equivalent UNIX scripts with Vertica queries.

Transferred data between production and development/test environments.

Coordinated with the production support team on code deployments.

Environment: Teradata 15/16, Vertica, UNIX, Hive, DbVisualizer, Workflow Manager, Workflow Monitor, Teradata SQL Assistant, SSMS, BTEQ, WinSCP, FastLoad, MultiLoad, FastExport, Tivoli, Autosys, TPump, Viewpoint, PL/SQL, SVN, Code Cloud, Informatica PowerCenter 10, Windows 7/10.

Client: Goldman Sachs, New York Oct 2016 – Feb 2018

Role: ETL Developer

Responsibilities:

Worked within the IT team, following the full software development lifecycle alongside development and production support teams.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.

Wrote BTEQ scripts to transform data and FastExport scripts to export data.

Worked with DBAs to tune application performance and manage backups.

Involved in developing Unit Test cases for the developed mappings.

Developed shell scripts to automate ETL execution in a UNIX environment.

Worked extensively on performance tuning of transformations, sources, sessions, mappings, and targets.

Implemented performance tuning logic on targets, sources, mappings, and sessions to maximize efficiency and performance.

Used ETL methodology to support data extraction, transformation, and loading processes.

Involved in the logical and physical design of the database and creation of the database objects.

Coded stored procedures, functions, and ref cursors to store, retrieve, and manipulate data; a sketch of the ref-cursor pattern follows.
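
A hypothetical PL/SQL sketch of that pattern (procedure, table, and column names are illustrative):

    CREATE OR REPLACE PROCEDURE get_accounts (
        p_status  IN  VARCHAR2,
        p_results OUT SYS_REFCURSOR
    ) AS
    BEGIN
        /* Hand the caller an open cursor over the filtered rows */
        OPEN p_results FOR
            SELECT account_id, status, open_dt
            FROM   accounts
            WHERE  status = p_status;
    END get_accounts;
    /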

Automated the ETL process using scheduling tools such as Autosys and Tivoli.

Involved in the performance tuning of the application through creation of necessary indexes.

Extensively used explain plans to gather statistical data for performance tuning of Oracle SQL queries and PL/SQL stored procedures.

Responsible for migrating data from extract tables to flat files and relational tables.

Worked with UNIX shell Scripts.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse; created mapplets for reuse across mappings.

Performed unit and integration testing and generated various test cases.

Performed Data analysis and Data validations.

Coordinated tasks and issues with the Project Manager and client on a daily basis.

Troubleshot slow-running queries and stored procedures.

Environment: Teradata 13.10, BTEQ/BTEQWin, TPT, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant, Informatica PowerCenter 9.1.0, Windows 7.

Client: PNC Bank, Pittsburgh, PA Aug 2015 – Sep 2016

Role: ETL Developer

Responsibilities:

Involved in Data Extraction, Transformation and Loading from source systems.

Performed data analysis and prepared the physical database based on the requirements.

Responsible for populating warehouse-staging tables.

Responsible for capacity planning and performance tuning.

Developed complex mappings using multiple sources and targets across different databases and flat files.

Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Developed BTEQ scripts for Teradata and automated workflows and BTEQ scripts.

Responsible for tuning the performance of Informatica mappings and Teradata BTEQ scripts.

Worked with DBAs to tune application performance and manage backups.

Worked on exporting data to flat files using Teradata FastExport.

Performed query optimization using explain plans, collected statistics, and primary and secondary indexes (see the sketch below).
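
A typical Teradata tuning sequence (table and column names are placeholders):

    /* Refresh optimizer statistics on join and filter columns */
    COLLECT STATISTICS ON edw.daily_sales COLUMN (store_id);
    COLLECT STATISTICS ON edw.daily_sales COLUMN (sale_dt);

    /* Inspect the plan, join strategy, and row estimates for a slow query */
    EXPLAIN
    SELECT store_id, SUM(amount)
    FROM   edw.daily_sales
    WHERE  sale_dt = DATE '2016-01-31'
    GROUP  BY store_id;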

Extracted data from flat files and other RDBMS sources into the staging area and populated the data warehouse.

Created and configured workflows and sessions to transport data to target warehouse tables using Informatica Workflow Manager.

Wrote several Teradata BTEQ scripts to implement business logic.

Worked exclusively with Teradata SQL Assistant to interface with Teradata.

Used CDC to integrate high volumes of data with significantly reduced impact on the production system; changes were captured as they happened, allowing up-to-the-minute data to flow between systems for real-time reporting, business intelligence, and application synchronization.

Used the Repository Server Administration Console to create and back up repositories.

Wrote UNIX shell scripts for processing and cleansing incoming text files.

Performed unit and integration testing and generated various test cases.

Coordinated tasks and issues with the Project Manager and client on a daily basis.

Environment: Teradata V12, SQL, BTEQ/BTEQWin, FastLoad, MultiLoad, FastExport, Informatica, Teradata SQL Assistant 13, Oracle 10g, PL/SQL, SQL*Loader, UNIX, Visual SourceSafe, Windows XP.

Client: Enterprise Holdings, St. Louis, MO Feb 2014 – Jul 2015

Role: ETL Developer

Responsibilities:

Developed BTEQ scripts for Teradata.

Understood the specifications and analyzed data according to client requirements.

Wrote UNIX shell scripts for processing and cleansing incoming text files.

Performed performance tuning, including collecting statistics, analyzing explain plans, and determining which tables needed statistics.

Extracted data from OLTP systems to dump files and uploaded it to the data warehouse using FastLoad, MultiLoad, and TPump (MultiLoad sketch below).
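
A minimal MultiLoad sketch of that load path (log table, layout, table, and file path are illustrative assumptions):

    .LOGTABLE edw_wrk.sales_ml_log;
    .LOGON tdprod/etl_user,etl_password;
    .BEGIN MLOAD TABLES edw.daily_sales;
    .LAYOUT sales_layout;
    .FIELD store_id * VARCHAR(10);
    .FIELD sale_dt  * VARCHAR(10);
    .FIELD amount   * VARCHAR(20);
    /* Label the DML so the import step can apply it per record */
    .DML LABEL ins_sales;
    INSERT INTO edw.daily_sales (store_id, sale_dt, amount)
    VALUES (:store_id, :sale_dt, :amount);
    .IMPORT INFILE /data/dumps/daily_sales.txt
        FORMAT VARTEXT '|'
        LAYOUT sales_layout
        APPLY ins_sales;
    .END MLOAD;
    .LOGOFF;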

Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.

Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.

Developed stored procedures, used them in Stored Procedure transformations for data processing, and worked with data migration tools.

Exported data from the database to flat files using FastExport.

Loaded data from various data sources and legacy systems into the Teradata production and development warehouses using BTEQ and Informatica.

Created primary indexes, secondary indexes, and partitioned primary indexes (PPIs); representative DDL is sketched below.
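
A sketch of that indexing work (table, columns, and partition range are assumptions for illustration):

    /* NUPI for even AMP distribution plus a monthly PPI for partition elimination */
    CREATE TABLE edw.sales_hist (
      store_id INTEGER,
      sale_dt  DATE,
      amount   DECIMAL(12,2)
    )
    PRIMARY INDEX (store_id)
    PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2014-01-01'
                                  AND     DATE '2015-12-31'
                                  EACH INTERVAL '1' MONTH);

    /* Secondary index to support frequent date lookups */
    CREATE INDEX (sale_dt) ON edw.sales_hist;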

Created Teradata external loader connections such as Upsert and Update while loading data into target tables in the Teradata database.

Implemented performance tuning logic on targets, sources, mappings, and sessions to maximize efficiency and performance.

Wrote several Teradata BTEQ scripts to implement business logic.

Worked exclusively with Teradata SQL Assistant to interface with Teradata.

Analyzed and provided solutions for all database-related issues.

Involved in understanding business and data needs, analyzing multiple data sources, and documenting data mappings to meet those needs.

Proven client-facing skills, an outstanding project success record, and excellent communication and collaboration.

Developed queries using Teradata SQL Assistant.

Used SVN as the version control tool.

Worked with Tivoli Workload Scheduler.

Environment: Teradata V2R12, Teradata SQL Assistant, BTEQ/BTEQWin, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Oracle 8i, PL/SQL, UNIX, SVN.

Client: Maruti, Hyderabad, India May 2013 – Nov 2013

Role: Software Developer

Responsibilities:

Collaborated closely with lead developers, architects, and DBAs.

Developed queries using Teradata SQL Assistant.

Created tables required by users for the most frequently used data and wrote scripts for those tables.

Involved in performance tuning of the SQL queries and the reports.

Helped with maintenance work related to reports.

Extracted data from OLTP systems to dump files and uploaded it to the data warehouse using FastLoad, MultiLoad, and TPump.

Debugged problems when migrating from Oracle to Teradata (conversion of data types, views, synonyms, tables, etc.).

Used SQL Assistant to query Teradata tables.

Created and supported development and testing Teradata databases and proposed backup and recovery strategies to the client.

Interacted with a mainframe server running MVS to extract DB2 data from data sets using JCL that included Teradata SQL statements and calls to Teradata utilities such as FastExport, FastLoad, MultiLoad, and BTEQ.

Prepared unit and system integration test cases.

Involved in setting up data for the UAT environment and providing support.

Responsible for collecting statistics on FACT tables.

Query optimization: explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.

Environment: Teradata V2R6, Ab Initio, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Oracle 10g, UNIX, Windows XP/2000.


