Administration Teradata AWS Redshift Snowflake ETL

Location:
Waukegan, IL
Posted:
September 24, 2021

Ravi Marada

Sr. Teradata DBA

********@*****.***

913-***-**** (c)

Experience Summary:

Over 10 years of IT experience in the administration, analysis, design, development, testing, and implementation of business application systems in data warehousing, across industries ranging from banking, retail, communications, and insurance to healthcare.

More than 8 years of experience in Teradata and ETL methodologies supporting data extraction, transformation, and loading through Teradata, IIS, SSIS, and DMExpress.

More than 7 years of Teradata administration experience with Teradata Administrator, Teradata SQL Assistant, the Teradata load/export utilities (BTEQ, FastLoad, MultiLoad, FastExport), and UNIX shell scripting.

Proficient in performance tuning through primary index (PI) choice, statistics collection, query rewriting, skew reduction, compression, and different types of partitioning (including columnar), as well as debugging of existing ETL processes; a sketch follows below.
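
A minimal sketch of the DDL these tuning levers touch, on a hypothetical fact table (all names, values, and date ranges are illustrative):

    -- Hypothetical fact table showing PI choice, multi-value compression,
    -- and monthly row partitioning (PPI).
    CREATE MULTISET TABLE edw.sales_fact
    (
      sale_id    BIGINT NOT NULL,
      store_id   INTEGER NOT NULL,
      sale_date  DATE NOT NULL,
      status_cd  CHAR(1) COMPRESS ('A','C','R'),  -- multi-value compression
      amount     DECIMAL(18,2)
    )
    PRIMARY INDEX (sale_id)  -- PI chosen for even row distribution (low skew)
    PARTITION BY RANGE_N (sale_date BETWEEN DATE '2015-01-01'
                          AND DATE '2021-12-31' EACH INTERVAL '1' MONTH);

    -- Statistics so the optimizer can cost joins and prune partitions.
    COLLECT STATISTICS COLUMN (sale_id), COLUMN (sale_date) ON edw.sales_fact;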

Leverage the Teradata-to-AWS S3 connection to deep-archive older data, thereby reducing storage costs on the Teradata nodes (sketched below).
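
A minimal sketch of one way to do this, assuming a Vantage release with Native Object Store (WRITE_NOS) support; the bucket, authorization object, and table names are hypothetical:

    -- Archive pre-2018 rows to S3 as Parquet via WRITE_NOS; edw.s3_auth is
    -- an assumed, pre-created authorization object holding S3 credentials.
    SELECT * FROM WRITE_NOS (
      ON ( SELECT * FROM edw.sales_fact
           WHERE sale_date < DATE '2018-01-01' )
      USING
        LOCATION ('/s3/my-archive-bucket.s3.amazonaws.com/sales_fact/')
        AUTHORIZATION (edw.s3_auth)
        STOREDAS ('PARQUET')
    ) AS archived;

    -- Once the archive is verified, the rows can be deleted to free node storage:
    -- DELETE FROM edw.sales_fact WHERE sale_date < DATE '2018-01-01';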

Provide technical solutions and guidance to business analysts on functional requirements, perform impact analysis on the system, create data models for projects, design logical models, and set up development environments.

Extensive administration experience using Teradata Administrator, Teradata Viewpoint, Teradata SQL Assistant, Teradata Studio, DBS Control, PDCR, and the Teradata BAR (Backup, Archive, and Restore) processes.

Translate business requirements into technical details through data modeling with different schemas (star and snowflake) to fit reporting, query, and business analysis requirements.

Design and maintain database systems according to analytical and reporting requirements using MicroStrategy, Cognos, Qlik Sense, and QlikView.

Integrate Teradata with Hadoop HDFS and manage Hadoop ecosystem components such as Hive, HBase, and Sqoop through Cloudera Manager.

Worked in all stages of the software development cycle: administration, design, specification, coding, debugging, testing (test plans and test execution), documentation, and maintenance.

Follow Agile methodology; document and track daily project activities and timelines through Jira and Confluence.

Strong communication, presentation, and interpersonal skills with excellent problem-solving capabilities.

Certifications: Teradata Certified Professional

Informatica PowerCenter 8.x Mapping Design

Education: Master's in Engineering, Gannon University, Erie, PA, 2007 - 2009

Bachelor's in Engineering, Andhra University, India, 2002 - 2006.

Technical Skills:

Reporting Tools

MicroStrategy 9.4.1, Cognos, OBIEE 11.1.x, Qlik Sense 13.32.84

Databases

Teradata 16.20, 15.1, 14.1, 13.1, V2R12; AWS Redshift; SQL Server 2005/2008/2008R2/2012; Oracle 11g, 10g

Tools

Cloudera Hadoop, Cloudera Manager, Informatica PowerCenter 10, 9.5/8.1, SQL Developer, TOAD, HP Quality Center 10, DMExpress 7.5, SSIS, CA AutoSys, crontab, Veritas NetBackup, HP ALM

Data Modeling Tool

ER/Studio 16, Erwin 9.5/4.0, MS Visio, Enterprise Architect 4.5

Operating Systems

Windows, Linux, Unix Shell Scripting

Cloud Computing

AWS (S3, IAM, EC2, EMR), Snowflake Virtual Warehouse

Professional Experience:

Client : AbbVie Inc. Waukegan IL Feb 2018 - Present

Position : Sr. Teradata DBA

Configure Teradata parameters, create prototype designs against logical data models, and define data repository requirements, data dictionaries, and warehousing requirements for the SMDW of US Commercial and BIOPS systems.

Support application development timelines by reviewing and assisting in the implementation of designs and/or incremental changes to Teradata definitions.

Provide technical guidance for integration, testing, design, development, planning of new major, large scale, production systems/databases.

Work with the Teradata loading utilities such as BTEQ, FastLoad, MultiLoad, and FastExport scripts.

Perform daily maintenance with Teradata tools and utilities such as Viewpoint and PDCR: DBQL monitoring, performance analysis using the PDCR toolkit (a sample check is sketched below), troubleshooting, problem resolution, and user assistance for the Teradata platform and Teradata client setup.
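
A minimal sketch of the kind of daily DBQL check involved; the DBC.DBQLogTbl columns are standard, while the top-20 cut is illustrative:

    -- Yesterday's top CPU-consuming queries, for triage.
    SELECT TOP 20
           username,
           queryid,
           ampcputime,
           totaliocount,
           starttime
    FROM   dbc.dbqlogtbl
    WHERE  CAST(starttime AS DATE) = CURRENT_DATE - 1
    ORDER BY ampcputime DESC;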

Perform backup and restore activities using the Data Stream Architecture (DSA).

Resolve complex Teradata performance and capacity issues, set replication policies using Veritas NetBackup, and address other data distribution issues based on PI and AMP distribution.

Manage and maintain all Teradata production and non-production databases, and refresh or restore data from one environment to another using DSA.

Responsible for Teradata standards and the design of physical data storage, maintenance, and access; implement security administration through Teradata Administrator or Teradata Studio.

Plan and maintain workloads through TASM (Teradata Active System Management) using Workload Designer and Workload Analyzer for low-, medium-, high-priority, and tactical workloads.

Responsible for space management, backup, monitoring, and maintaining ongoing performance of the Teradata platform.

Participate as Subject Matter Expert (SME) as required for design reviews and new technology.

Perform capacity planning including allocating system storage and planning future storage requirements.

Plan and coordinate system and database upgrade and patch activities with Teradata service professionals and field engineers.

Use the Teradata Support ticketing system to follow up on issue resolution, incidents, and database hardware maintenance activities such as upgrades, patches, and change controls.

Client : CVS Health Care Dallas TX Mar 2017 - Jan 2018

Position : Sr. Teradata DBA

Design, create, and regularly tune physical database objects (tables, views, stored procedures, and indexes).

Perform DBA activities including creating users; allocating spool, temporary, and permanent space; checking table skew; applying table compression; and troubleshooting failed jobs.

Built automated jobs for day-to-day admin activities to report on object access and space usage through UNIX shell scripts, scheduled in crontab (a sketch follows below).
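
A minimal sketch of such a job as a BTEQ script; the logon details, names, and the 85% threshold are hypothetical, and the script might be wrapped in a shell script scheduled with a crontab entry such as 0 6 * * * /home/dba/space_report.sh:

    .LOGON tdprod/dbadmin,password;

    -- Databases above 85% of their allocated perm space.
    SELECT databasename,
           SUM(maxperm)     AS max_perm,
           SUM(currentperm) AS current_perm,
           CAST(SUM(currentperm) * 100.0 / NULLIF(SUM(maxperm), 0)
                AS DECIMAL(5,2)) AS pct_used
    FROM   dbc.diskspacev
    GROUP BY databasename
    HAVING SUM(currentperm) * 100.0 / NULLIF(SUM(maxperm), 0) > 85
    ORDER BY pct_used DESC;

    .LOGOFF;
    .QUIT;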

Resolve issues in Service Now (SNOW) ticketing queues to meet SLAs on a daily basis for different insurance clients.

Optimize database and application performance using EXPLAIN plans, statistics collection, index selection, workload balancing, increased sessions, and similar techniques (sketched below).
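
A minimal sketch of that loop on a hypothetical query (table and column names are illustrative):

    -- Inspect the optimizer's plan for a problem query.
    EXPLAIN
    SELECT c.region, SUM(s.amount)
    FROM   edw.sales_fact   s
    JOIN   edw.customer_dim c
           ON s.customer_id = c.customer_id
    GROUP BY c.region;

    -- If the plan reports "no confidence" on the join step, collect
    -- statistics on the join column and re-check the plan.
    COLLECT STATISTICS COLUMN (customer_id) ON edw.sales_fact;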

Work on object and data migration of the legacy Zeus and CDW applications from Oracle to Teradata.

Responsible for change control and Release of Enterprise data models for data warehouse subject areas such as Claims, Member and Eligibility.

Provide documentation for offshore DBAs to monitor the system during their shifts, and provide 24/7 on-call support.

Work with the Database Query Log (DBQL) for query performance tuning and with Teradata Active System Management (TASM).

Client : New York and Company New York NY May 2016 - Dec 2016

Position : Sr. Teradata DBA and Data Modeler

Roles & Responsibilities:

•Worked on building a new data warehouse for the New York and Company retail domain.

•Developed the KPI Dashboard project, one of the cost-reduction initiatives, which eliminates wastage in sample orders.

•Troubleshot day-to-day Teradata production issues, such as space and spool issues, to meet end-client SLAs through the JIRA ticketing queue.

•Performed performance tuning on complex queries involving multi-table joins, join indexes, partitioned primary indexes (PPI), and secondary index mechanisms.

•Created macros to automate user creation along with space and role allocation for new users (see the sketch after this list).

•Monitored Teradata systems during a successful upgrade from Teradata 14.0 to Teradata 15.0.

•Worked on a POC for report generation using basic Python scripting.

•Capacity planning: properly identify the hardware, software, and database configuration/architecture necessary to support application needs and growth.

•On-call support: 24/7 database support, off-site and onsite (rotating on-call, bi-weekly).
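
The user-creation macros mentioned above would wrap DDL along these lines; a minimal sketch in which the user, role, and database names, space sizes, and password are all hypothetical:

    -- Create the user with perm and spool allocations.
    CREATE USER rpt_user01 FROM edw_users
      AS PERMANENT = 2000000000,   -- ~2 GB perm space
         SPOOL = 50000000000,      -- ~50 GB spool
         PASSWORD = TempPass123,
         DEFAULT DATABASE = edw_reporting;

    -- Grant the standard reporting role and make it the default.
    GRANT rpt_role TO rpt_user01;
    MODIFY USER rpt_user01 AS DEFAULT ROLE = rpt_role;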

Client : Wells Fargo Fremont CA Mar 2015 - Apr 2016

Position : Teradata DBA

Roles & Responsibilities:

Worked on the Regulatory Data Reporting project in the banking domain.

Designed, created, and regularly tuned physical database objects of the Regulatory Data Repository (tables, views, and indexes) to support OBIEE reports running against the Teradata database.

Design, create and regular tuning of physical database objects (tables, views, indexes) to support normalized and dimensional models.

Proactively monitor Teradata database space utilization and regularly clean up unused objects to release space (a sketch follows below).
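
A minimal sketch of the kind of check used to find clean-up candidates; the DBC.TableSizeV view is standard, while the database name is hypothetical:

    -- Largest tables in a database, with a rough skew indicator
    -- (max per-AMP perm vs. average per-AMP perm).
    SELECT databasename,
           tablename,
           SUM(currentperm) AS table_perm,
           MAX(currentperm) * 1.0 / NULLIF(AVG(currentperm), 0) AS skew_ratio
    FROM   dbc.tablesizev
    WHERE  databasename = 'EDW'
    GROUP BY databasename, tablename
    ORDER BY table_perm DESC;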

Work with Teradata Viewpoint (database console: ShowLocks, query sessions, query configuration; monitoring: sessions, nodes, vprocs, and total space usage).

Worked with TASM workloads for the Infolease and Midas applications of the Wells Fargo Teradata EDW.

Test stored procedures and functions; perform unit and integration testing of Informatica sessions, batches, and target data.

Optimize query performance, session performance, and reliability.

Develop BTEQ scripts to transform data from the source and populate tables in the IMDM data mart (sketched below).
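
A minimal sketch of such a BTEQ script; the logon details and all object names are hypothetical:

    .LOGON tdprod/etl_user,password;

    -- Transform staged rows and populate the data mart target.
    INSERT INTO imdm_mart.customer_summary
    SELECT customer_id,
           TRIM(BOTH FROM customer_name),
           COUNT(*)    AS order_cnt,
           SUM(amount) AS total_amount
    FROM   stg.orders
    GROUP BY 1, 2;

    .IF ERRORCODE <> 0 THEN .QUIT 8;  -- fail the job on any SQL error
    .LOGOFF;
    .QUIT 0;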

Use HP Quality Center for defect tracking and reporting; hold weekly risks/issues meetings with stakeholders.

Develop FastLoad scripts to load high volumes of data from flat files to tables in Teradata (sketched below).
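
A minimal sketch of a FastLoad script for a pipe-delimited flat file; the logon, file path, and object names are hypothetical, and with VARTEXT input the staging table's columns are assumed to be VARCHAR:

    LOGON tdprod/etl_user,password;

    DROP TABLE stg.orders_err1;
    DROP TABLE stg.orders_err2;

    SET RECORD VARTEXT "|";            /* pipe-delimited input file */
    DEFINE order_id   (VARCHAR(18)),
           order_date (VARCHAR(10)),
           amount     (VARCHAR(20))
    FILE = /data/in/orders.dat;

    BEGIN LOADING stg.orders_stage
          ERRORFILES stg.orders_err1, stg.orders_err2;
    INSERT INTO stg.orders_stage
           (order_id, order_date, amount)
    VALUES (:order_id, :order_date, :amount);
    END LOADING;

    LOGOFF;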

Work with pushdown optimization techniques (source, target, and full pushdown) to improve the performance of mappings.

Provided project development estimates to the business and, upon agreement, delivered projects accordingly.

Client : Bancvue Ltd Austin TX Jan 2013 - Feb 2015

Position : Teradata DBA

Roles & Responsibilities:

Designed, created, and tuned RateWatch project database objects such as EDW tables, views, and indexes; also created join indexes for faster data retrieval (see the sketch below).
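
A minimal sketch of one such join index; all names are hypothetical:

    -- Pre-joined, pre-aggregated structure the optimizer can substitute
    -- for re-joining the base tables at query time.
    CREATE JOIN INDEX edw.rate_by_branch_ji AS
    SELECT b.branch_id,
           b.branch_name,
           r.rate_date,
           SUM(r.rate_value) AS total_rate
    FROM   edw.rate_fact  r
    JOIN   edw.branch_dim b
           ON r.branch_id = b.branch_id
    GROUP BY b.branch_id, b.branch_name, r.rate_date
    PRIMARY INDEX (branch_id);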

Optimize query performance, session performance, and reliability.

Extensive use of Informatica tools in the RCX transactional-data project: created mappings in Designer, developed workflows in Workflow Manager, and monitored mapping runs in Workflow Monitor.

Designed process-oriented UNIX scripts and ETL processes for loading data into the data warehouse.

Teradata performance tuning via EXPLAIN plans, PPI, indexes, statistics collection, or code rewrites.

Back up, archive, and restore Teradata objects using ARCMAIN with the TARA GUI, and automate the jobs using crontab (a sketch follows below).
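
A minimal sketch of the ARC script such an arcmain job might run; the logon and names are hypothetical, and the FILE name maps to the archive output file configured for the job:

    LOGON tdprod/backup_user,password;

    /* Nightly archive of the EDW database, releasing the utility lock. */
    ARCHIVE DATA TABLES (edw) ALL,
      RELEASE LOCK,
      FILE = ARCHFILE;

    LOGOFF;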

Work with pushdown optimization techniques (source, target, and full pushdown) to improve the performance of mappings.

Requirements analysis, data assessment, and business process reengineering.

Coding using Teradata analytical functions and Teradata BTEQ SQL; write UNIX scripts to validate, format, and execute the SQL in the UNIX environment.

Interact with business users and source system IT teams to define, agree and document incoming data mapping specifications.

Index maintenance & analysis.

Work with ETL leads to formulate the ETL approach and make appropriate use of Teradata Tools and Utilities.

Client : Lord and Taylor St Louis MO Sep 2011 - Dec 2012

Position : Teradata ETL Consultant

Roles & Responsibilities:

Developed Informatica Mappings and Workflows for E_Comm Data of Lord and Taylor.

Loaded flat files for cross-selling product data using Informatica PowerCenter.

Optimized high-volume tables (including collection tables) in Teradata using various join index techniques, secondary indexes, join strategies, and hash distribution methods.

Provide ongoing support by developing processes, executing object security and access privilege setup, and actively monitoring performance.

Expertise in using Visual Explain, Index Wizard, and Statistics Wizard to tune poorly performing queries, analyze plans, and implement recommendations to improve performance.

Designed delta loads for all fact tables.

Resolve and work on issues across multiple functional areas. Effectively monitor and take action to ensure coordination and effectiveness of all components and activities and decide on issues requiring escalation.

Help create and maintain operational documentation to support the solution in a production environment.

Client : NIKE Beaverton OR Aug 2010 - Aug 2011

Position : Teradata Consultant

Roles & Responsibilities:

Used loader scripts such as BTEQ, MultiLoad, FastLoad, and FastExport for the POS (Point of Sale) project.

Extensively used stored procedures in the To-Date project, where the loading environment is the same across weekly and monthly loading procedures.

Used Informatica (ETL) mappings to load data into data warehouse systems.

Created parameter files for workflows and mappings so various applications could use different environments: Development, Model Office (testing), and Production.

Used Workflow Manager for creating, validating, testing, and running workflows and sessions, and for scheduling them to run at specified times.

Implemented performance tuning on transformations, sessions, and the source databases to load data efficiently into the target.

Developed FastLoad, MultiLoad, and BTEQ scripts to load data from various data sources and legacy systems into Teradata.

Create and maintain user profile definitions to manage performance.

Populate or refresh Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.

Client : CMS – QSSI Baltimore MD Apr 2009 - Jun 2010

Position : ETL Developer

Roles & Responsibilities:

Participate in system-related walkthroughs.

Perform analysis necessary to develop specifications.

Analyze new data sources for inclusion into the enterprise data warehouse.

Work with federal leads and partner staff, brainstorming impact analysis to identify ETL requirements.

Work on creating Interface Control Documents (ICDs) to reflect functional requirements.

Create ETL functional specifications that document source-to-target mappings.

Maintain existing Informatica ETL streams for Part A, Part B, and DME claims running on a Teradata database.

Maintain existing UNIX/BTEQ scripts for the Part D ETL stream.

Create and execute test plans for newly developed ETL streams.

Implement Informatica ETL per CMS standards on a Teradata database with a fact-and-dimension data warehouse model and referential integrity.


