
Data Information Technology

San Jose, California, United States
May 30, 2017



Sravanthi Kotha 408-***-****


Around 7 years of strong experience in the design and development of Data Warehousing and Business Analytics solutions

Good experience in the Healthcare, Finance, and Telecom domains

Self-motivated and goal-oriented; able to operate with a sense of urgency to meet deadlines and take ownership of assignments

Quick learner who can easily adapt to different responsibilities and technologies

Strong analytical, problem-solving, and communication skills


Operating Systems: Windows XP/7, Unix, iOS

Languages: SQL, Unix Shell Script, Python

Database: Teradata, Oracle, Hadoop

ETL Tools: Informatica PowerCenter; Teradata Tools & Utilities such as TPT Load, TPT Export, MultiLoad, TPump, FastLoad, FastExport, BTEQ, SQL Assistant

BI Tools: Tableau

Other Tools: AutoSys, ESP, Git, SVN

Database Tools: SQL Developer, Teradata Studio, SQL Assistant, SQL Workbench


Requirement gathering and analysis, design, coding, and testing for new projects

Design and development of Informatica interfaces to ingest data from a variety of transactional applications, pulling data from Oracle, SAP, and flat files into the data warehouse, using Informatica transformations such as Expression, Filter, Joiner, Lookup, Router, Stored Procedure, and Update Strategy

Loading data into the Teradata data warehouse from files generated by Informatica, using Teradata utilities such as TPT, TPump, FastLoad, and MultiLoad
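A utility-driven file load of this kind can be sketched as a shell wrapper that generates a FastLoad control script and submits it. This is a minimal illustrative sketch only; the database, table, file, and logon names below are hypothetical placeholders, not details from this resume.

```shell
#!/bin/sh
# Illustrative sketch of a FastLoad-style file load (all names are
# hypothetical placeholders, not taken from any real project).
CTLFILE=stg_orders.fastload

# Generate the FastLoad control script for a pipe-delimited feed.
cat > "$CTLFILE" <<'EOF'
LOGON tdprod/etl_user,etl_pass;
DATABASE edw_stg;
BEGIN LOADING stg_orders ERRORFILES stg_orders_e1, stg_orders_e2;
SET RECORD VARTEXT "|";
DEFINE order_id (VARCHAR(18)), order_dt (VARCHAR(10)), amount (VARCHAR(12))
  FILE = orders.dat;
INSERT INTO stg_orders VALUES (:order_id, :order_dt, :amount);
END LOADING;
LOGOFF;
EOF

# In a real environment the script would then be submitted, e.g.:
#   fastload < "$CTLFILE"
echo "wrote $CTLFILE"
```

The wrapper only writes the control script here; the actual `fastload` submission is left commented out since it requires a live Teradata system.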

Data modeling using PowerDesigner

Design and development of Teradata tables, views, indexes, and procedures with optimum query performance. Writing complex yet optimized procedures to build aggregates for reporting metrics, as well as metrics derived from raw data, for better ad hoc query performance. Designing the physical schema and populating reporting tables

Experience in using Teradata SQL Assistant, volatile tables, UPI, NUPI, USI, NUSI, analytical functions, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, FastExport, and TPT

Thorough technical, functional, and load testing; coordination of user acceptance testing; and implementation of changes in the production environment

Periodic monitoring of system performance and optimization

Performing data replication across multiple Teradata environments to handle failovers

Generated Tableau reports for analytical usage

Formal training in Hadoop and Python technologies


Retail Global Customer Analytics and Marketing

Apple Inc., Sunnyvale, CA

Data Warehouse/BI Developer Lead Mar 2015 – Present

Worked in different areas such as Reseller Operations, Marketing, and Retail. The main scope of this project is to load data into the EDW and generate reports for business users to support data analysis

Involved in business discussions and gathering requirements

Analyzing requirements and holding business meetings to discuss requirement gaps

Designing semantic data models and database objects such as tables, views, and procedures

Developed preprocess scripts to cleanse source files
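A typical preprocess cleansing step of this kind can be sketched as a short shell pipeline: strip Windows carriage returns, drop blank records, and trim stray spaces before field delimiters. The filenames, sample data, and cleansing rules below are purely illustrative assumptions.

```shell
#!/bin/sh
# Illustrative cleansing sketch for a pipe-delimited source file
# (filenames, sample rows, and rules are hypothetical).
IN=source_raw.dat
OUT=source_clean.dat

# Sample raw feed: CRLF line endings, a blank record, a trailing space.
printf 'id|name|amount\r\n1|acme |100\r\n\r\n2|beta|200\r\n' > "$IN"

# Strip carriage returns, drop empty lines, trim spaces before delimiters.
tr -d '\r' < "$IN" | grep -v '^$' | sed 's/ *|/|/g' > "$OUT"

cat "$OUT"
```

On the sample feed above, the pipeline emits three clean records (`id|name|amount`, `1|acme|100`, `2|beta|200`), ready for a utility load.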

Developed Informatica interfaces to ingest data from different source systems such as Oracle and flat files into the EDW

Used Teradata analytical functions, UPI, and NUPI; data load and export using FastLoad, MultiLoad, BTEQ, FastExport, and TPT

Writing complex SQL queries in Teradata for ad-hoc analytics

Creating and scheduling Autosys jobs with proper conditions and dependencies

Worked on ad hoc requests such as lookup files and user query analysis

Monitoring IO, CPU utilization and skew of scheduled processes and performance tuning

Performed unit and integration testing and supported UAT. Involved in CR implementation activities and post-implementation validation

Generated Tableau reports for analytical usage

Campaign Analytics

Wells Fargo, San Francisco, CA

Sr. Teradata/ETL Developer/Analyst Nov 2014 – Feb 2015

The main scope of the project is to analyze and check the performance of outbound and inbound campaigns, and to generate and maintain a characteristic library based on customer response and click data for those campaigns

Involved in the meetings with business users to gather campaign details

Developed BTEQ scripts to analyze the inbound campaigns and generate the characteristic library.

Writing complex SQL queries in Teradata for ad-hoc analytics

Created Volatile tables, work tables to store intermediate results and aggregates

Used Teradata analytical functions, UPI, and NUPI; data load and export using FastLoad, MultiLoad, BTEQ, and FastExport

Developed cron jobs to automate and monitor the process
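Cron-driven automation like this usually pairs a crontab entry with a wrapper script that takes a lock so overlapping runs are skipped. The sketch below is a hypothetical illustration; the schedule, paths, and job names are assumptions, and the real workload (e.g. a BTEQ submission) is left as a comment.

```shell
#!/bin/sh
# Illustrative cron wrapper sketch with a simple mkdir-based lock
# (schedule, paths, and job names are hypothetical).
# Crontab entry would look like:  15 2 * * * /opt/etl/run_campaign_load.sh
LOCKDIR=campaign_load.lock
LOG=campaign_load.log

if mkdir "$LOCKDIR" 2>/dev/null; then
  # We hold the lock; release it when the shell exits.
  trap 'rmdir "$LOCKDIR"' EXIT
  echo "$(date '+%Y-%m-%d %H:%M:%S') load started" >> "$LOG"
  # Real work would go here, e.g.:  bteq < campaign_library.btq
  echo "$(date '+%Y-%m-%d %H:%M:%S') load finished" >> "$LOG"
else
  # A previous run is still active; log and skip this cycle.
  echo "$(date '+%Y-%m-%d %H:%M:%S') previous run still active, skipping" >> "$LOG"
fi
```

The `mkdir` lock is atomic on POSIX filesystems, which is why it is a common lightweight alternative to `flock` in portable cron wrappers.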

Monitoring IO, CPU utilization and skew of scheduled processes as well as ad hoc user queries and performance tuning

Generated Tableau reports to analyze the characteristic library and campaign performance

Involved in meetings with business users to identify the top characteristics of each campaign

Marketing Analytics

Cigna, Greenwood Village, CO

Teradata/ETL Developer/Analyst Oct 2013 – Oct 2014

The main scope of this project is to extract data from different source systems such as MEMBOB, PHARMACY X WALK, and MEDICAL PCT and load it into the Cigna Central Warehouse (CCW). The main purpose of the CCW is to provision transactional data for analysis and trending.

Active involvement in project workgroup meetings to resolve any ambiguity, incompleteness or incorrectness of requirement

High-level impact analysis, business background analysis, arriving at basic assumptions, and estimation of work

Involved in analysis and development of the Business Requirement Document and the Design Document

Worked on data model builds

Worked on source connection approvals and builds

Participation in full Software Development Life Cycle (SDLC) of data warehousing projects.

Implemented Aggregator, Filter, Joiner, Expression, Lookup, Sequence Generator, and Update Strategy transformations

Developed reusable ETL components for use in multiple processes

Wrote extensive SQL for data validation and requirement completion

Loading data into Teradata data warehouse tables from multiple data sources using the Teradata utilities FastLoad, MultiLoad, BTEQ, and TPump

Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval

Performed End to End testing. Also worked closely with Users and helped in accomplishing the UAT successfully.

Customer Billing and Collections

Convergys, Hyderabad, India

Teradata/ETL Developer Jun 2006 – Oct 2008

VIVO is one of the major wireless companies in Brazil. The company has huge volumes of data coming from different sources, which was generated from multiple and cross platform applications. The Project involved design and development of data mart for the Billing and Collections department.

Involved in database set up like connections for the sources and targets.

Experience in documenting Design specs, Unit test plans and Deployment plans.

Responsible for verification of daily loads
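Daily load verification of this kind commonly reduces to comparing the source-feed record count against the count actually loaded. The sketch below is a hypothetical illustration: the feed name and sample rows are made up, and the loaded count is hard-coded where a real script would query the target table.

```shell
#!/bin/sh
# Illustrative daily-load verification sketch (feed name, sample rows,
# and counts are hypothetical).
printf 'r1\nr2\nr3\n' > billing_feed.dat   # sample source feed
SRC_COUNT=$(wc -l < billing_feed.dat)
LOADED_COUNT=3   # in practice this would come from a BTEQ SELECT COUNT(*)

if [ "$SRC_COUNT" -eq "$LOADED_COUNT" ]; then
  echo "daily load OK: $LOADED_COUNT rows"
else
  echo "daily load MISMATCH: source=$SRC_COUNT loaded=$LOADED_COUNT" >&2
  exit 1
fi
```

A non-zero exit on mismatch lets a scheduler such as cron or AutoSys flag the failed verification automatically.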

Teradata FastLoad, MultiLoad, and BTEQ utilities were used to load data from sources to targets

Created mappings and workflows for data loading.

Implemented Variables and Parameters in the mappings

Wrote UNIX shell scripts to automate the process.

Involved in code review, testing and quality assurance of data

Worked with various active & passive transformations.


Received “SEE Award” from the client

Received two “POWER OF ONE” cards

Teradata Certified SQL Specialist V2R5


Jawaharlal Nehru Technological University, India 2002 - 2006

Bachelor of Engineering in Information Technology.
