
Data Developer


SIMHACHALAM MALLEDI

M: 980-***-****

Email: ac9vpw@r.postjobfree.com

SUMMARY:

Certified Teradata 12 Technical Specialist with 11+ years of IT experience in the implementation of Data Warehousing and BI projects using Teradata utilities, UNIX, Teradata SQL Assistant 13.0, COBOL, JCL, CICS, DB2, VSAM, SQL, TSO/ISPF, and mainframe tools such as ENDEVOR, File-AID, Xpeditor, Platinum, Easytrieve, SPUFI, QMF, NDM, SDSF, CA7, and Control-M.

Extensive experience in Business Analysis, Application Design, Development, and Implementation for Banking and Financial Services.

Worked on the migration from Teradata 12 to Teradata 13.

Worked on the migration from Teradata 13.10 to Teradata 14.

Strong hands-on experience using Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, TPump).
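
For illustration, a minimal BTEQ load script along these lines would be representative of this kind of work (TDPID, credentials, database, and table names are hypothetical):

    .LOGON tdprod/etl_user,password              /* hypothetical TDPID and credentials */
    .SET ERROROUT STDOUT

    /* move the day's staged rows into the target table */
    INSERT INTO edw_db.customer_dim
    SELECT *
    FROM   stg_db.customer_stg
    WHERE  load_dt = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8              /* fail the job on any SQL error */
    .LOGOFF
    .QUIT 0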

Conducted code walkthroughs and internal and external quality assurance reviews of applications, debugged and fixed identified defects, and compared test results with production results to confirm that changes were effective.

Proficient in developing strategies for extracting, transforming, and loading data using Informatica PowerCenter 9.1/10.2.

Experience in designing and developing stored procedures, functions, tables, views, indexes, triggers, and user-defined data types.
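
As a small, hypothetical sketch of this kind of object development in Teradata SQL (all names are illustrative only):

    CREATE SET TABLE hr_db.employee
    (
        emp_id    INTEGER NOT NULL,
        emp_name  VARCHAR(100),
        dept_id   INTEGER,
        hire_dt   DATE
    )
    UNIQUE PRIMARY INDEX (emp_id);

    CREATE VIEW hr_db.v_employee AS
    SELECT emp_id, emp_name, dept_id
    FROM   hr_db.employee;

    /* simple stored procedure wrapping the insert */
    REPLACE PROCEDURE hr_db.add_employee
        (IN p_emp_id INTEGER, IN p_emp_name VARCHAR(100), IN p_dept_id INTEGER)
    BEGIN
        INSERT INTO hr_db.employee (emp_id, emp_name, dept_id, hire_dt)
        VALUES (p_emp_id, p_emp_name, p_dept_id, CURRENT_DATE);
    END;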

Implemented data strategies, built data flows, and developed conceptual, logical, and physical data models to support new and existing projects.

Extensive experience in end-to-end implementation of data warehouses and a strong understanding of data warehouse concepts and methodologies.

Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing and Load Testing.

Proven track record in delivering effective design documents, code deliverables, test strategies and business value to the customer.

Experience with One Automation and Tidal for creating JIL files and monitoring jobs.

Expert knowledge of Teradata, with good exposure to other databases such as mainframe DB2, Oracle, and SQL Server.

Expertise in setting up testing environments, defining, creating, documenting, verifying and executing test cases, test scenarios and Test plans.

Quick learner and keen observer, skilled at determining external and internal customer needs and well versed in internal and external relationship building. Good team player who likes to be part of a team that explores out-of-the-box solutions.

Knowledge of Informatica Power Exchange.

TECHNICAL SKILLS:

Operating Systems: UNIX, Linux and Windows XP/7.

Languages: Teradata SQL, Advanced Teradata SQL.

Databases: Teradata V2R12/V2R13.01/V15.10.1.4, Oracle 10g, SQL Server, Hive

ETL/BI Tools: Informatica 8.5/9.0.1/10.2, SSRS, Talend

Teradata Tools & Utilities: BTEQ, Fast Load, FastExport, TPUMP, Multi Load, and SQL Assistant.

Scheduler: One Automation, Tidal

ACADEMICS:

M.C.A (Master of Computer Applications) from Andhra University.

Teradata 12 Certified Technical Specialist

IBM certified Database Associate DB2 9

IBM Certified Operator - AIX 6.1 Basic Operations

PROJECT PROFILE:

Verizon Communications, Basking Ridge, NJ

Jul 2018 to Jul 2019

Teradata Analyst

VCM - Revenue per Customer per Month: Billed Revenue per Customer per Month (Billed RCM) is defined as total billed revenue divided by subscribers. Charges associated with non-VZ Wireline carrier codes are excluded. Revenue by customer is aggregated based on the Global Product Hierarchy at the PR3, PR4, and PR6 levels and by Charge Basis Code. The key objective of this project is to add the costs associated with providing TV, Internet (FiOS & HSI), and Voice services to the RCM2 database in order to derive customer profitability. Cost is broken out into the following categories: content, equipment (routers & set-top boxes), bill & payment processing, customer care, acquisition, dispatch (truck rolls for repairs & upgrades), taxes (pass-through), suspension, overhead, and other costs. Once the database is complete, users will be able to extract customer-level economics, i.e. revenue, expense, and margin.
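
For illustration only, the Billed RCM measure described above maps to an aggregation of roughly this shape in Teradata SQL (all table and column names are hypothetical):

    SELECT  r.pr3_cd,
            r.pr4_cd,
            r.pr6_cd,
            r.charge_basis_cd,
            SUM(r.billed_rev_amt)                        AS total_billed_rev,
            COUNT(DISTINCT r.cust_id)                    AS subscriber_cnt,
            SUM(r.billed_rev_amt)
              / NULLIFZERO(COUNT(DISTINCT r.cust_id))    AS billed_rcm
    FROM    rcm2_db.billed_revenue r
    WHERE   r.bill_month = DATE '2019-06-01'
      AND   r.carrier_cd NOT IN                          -- exclude non-VZ Wireline carrier codes
            (SELECT carrier_cd FROM rcm2_db.non_vz_wireline_carriers)
    GROUP BY 1, 2, 3, 4;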

Responsibilities:

Responsible for end-to-end design and implementation covering the complete SDLC.

Expertise in Teradata, BTEQ scripts, ETL (Informatica), UNIX, One Automation scheduler

Experienced in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis, and developing high-level and detail-level designs for batch processes built in Teradata and ETL.

Experienced in automating and scheduling ETL applications and Teradata BTEQ/SQL scripts in UNIX via job schedulers (One Automation and Tidal).

Experienced in performing data profiling, data analysis, UAT data loads, and end-to-end production implementation.

Experienced in creating test cases, testing strategy, UAT plan and production Validation Approach.

Experienced in data movement via ETL and Teradata using FastExport, FastLoad, MultiLoad, the IMPORT utility, etc.

Well acquainted with dimensional modeling and slowly changing dimension concepts.

Gathering the required information from the users and preparing the design documents.

Interacted with different system groups for analysis of systems.

Analyzing applications to be changed for particular business requirements

Developing BTEQ, MultiLoad, TPT, and FastLoad scripts to populate data into the Teradata database.

Developing SQL join indexes to support strategic and tactical queries.

Implemented SCD Type 4 logic for capturing data changes.
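
A simplified sketch of the SCD Type 4 pattern used here, with a current table plus a separate history table (all object and column names are hypothetical):

    /* archive the about-to-change current rows into the history table */
    INSERT INTO edw_db.customer_hist
    SELECT c.*, CURRENT_TIMESTAMP AS archived_ts
    FROM   edw_db.customer_curr c
    JOIN   stg_db.customer_delta d
      ON   c.cust_id = d.cust_id
    WHERE  c.cust_addr <> d.cust_addr;

    /* then overwrite the current table with the latest attribute values */
    UPDATE c
    FROM   edw_db.customer_curr AS c, stg_db.customer_delta AS d
    SET    cust_addr = d.cust_addr,
           updt_ts   = CURRENT_TIMESTAMP
    WHERE  c.cust_id = d.cust_id
      AND  c.cust_addr <> d.cust_addr;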

Populating FastLoad and MultiLoad tables using different data load and unload utilities of Teradata.

Generated a number of ad hoc reports as per client requirements.

Used MULTISET tables to load bulk data in the form of inserts and deletes.
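
A hypothetical MULTISET table of this kind (duplicate-row checking is skipped, which suits bulk inserts and deletes) might look like:

    CREATE MULTISET TABLE stg_db.call_detail_stg
    (
        call_id       BIGINT,
        cust_id       INTEGER,
        call_ts       TIMESTAMP(0),
        duration_sec  INTEGER
    )
    PRIMARY INDEX (cust_id);

    /* bulk insert from the daily source, then trim aged rows */
    INSERT INTO stg_db.call_detail_stg
    SELECT call_id, cust_id, call_ts, duration_sec
    FROM   src_db.call_detail_daily;

    DELETE FROM stg_db.call_detail_stg
    WHERE  CAST(call_ts AS DATE) < CURRENT_DATE - 90;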

Created indexes and joins on tables as per requirements.

Created various types of temporary tables such as volatile and global temporary tables.
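
Two hypothetical examples of the temporary table types mentioned above:

    /* volatile table: session-scoped, definition and rows vanish at logoff */
    CREATE VOLATILE TABLE vt_daily_rev AS
    (
        SELECT cust_id, SUM(billed_rev_amt) AS rev_amt
        FROM   edw_db.billed_revenue
        WHERE  bill_dt = CURRENT_DATE
        GROUP BY 1
    )
    WITH DATA
    ON COMMIT PRESERVE ROWS;

    /* global temporary table: definition persists in the dictionary, contents are per-session */
    CREATE GLOBAL TEMPORARY TABLE gt_cust_scratch
    (
        cust_id  INTEGER,
        flag_cd  CHAR(1)
    )
    ON COMMIT PRESERVE ROWS;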

Worked with collect statistics and join indexes.
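
For example (hypothetical objects), statistics collection and an aggregate join index might be set up as:

    /* keep optimizer statistics current on join and filter columns */
    COLLECT STATISTICS ON edw_db.billed_revenue COLUMN (cust_id);
    COLLECT STATISTICS ON edw_db.billed_revenue COLUMN (bill_month);

    /* pre-aggregate and pre-join revenue to the customer dimension for frequent queries */
    CREATE JOIN INDEX edw_db.ji_cust_rev AS
    SELECT c.cust_id, c.cust_segment, r.bill_month, SUM(r.billed_rev_amt) AS rev_amt
    FROM   edw_db.customer_curr c
    JOIN   edw_db.billed_revenue r
      ON   c.cust_id = r.cust_id
    GROUP BY c.cust_id, c.cust_segment, r.bill_month
    PRIMARY INDEX (cust_id);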

Involved in unit testing, systems integration and user acceptance testing.

Involved in Preparing Test Cases and performing Testing.

Environment: Teradata R15.10.1.4, Informatica 10.2, Linux 6.8, Tableau

Charter Communications, Charlotte, NC

Nov 2016 to June 2018

Teradata Developer

The Project 270 IT workstream brings call data into XDW2 for data integration of Legacy Charter (L-CHR), Legacy Time Warner Cable (L-TWC), and Legacy Bright House Networks (L-BHN) data. The scope of this project is to provide the call-data-specific integration design elements needed for Project 270 updates to XDW2, providing a single point of truth for all downstream applications and tools. Generating a dynamic parameter file with a date filter condition to extract data from different sources. Based on the filter condition, extracting data from the different sources to flat files. Loading data from flat files to Teradata staging tables and persistent tables. Based on business logic, loading data from persistent to CORE tables.
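
A condensed, hypothetical sketch of the staging-to-persistent-to-core movement described above (object names are illustrative only):

    /* stage to persistent: keep only the latest image per call record */
    INSERT INTO xdw2_db.call_event_pers
    SELECT *
    FROM   xdw2_db.call_event_stg
    QUALIFY ROW_NUMBER() OVER (PARTITION BY call_id ORDER BY event_ts DESC) = 1;

    /* persistent to core: apply the business rules before publishing */
    INSERT INTO xdw2_db.call_event_core
    SELECT p.call_id, p.acct_id, p.legacy_src_cd, p.event_ts, p.handle_time_sec
    FROM   xdw2_db.call_event_pers p
    WHERE  p.legacy_src_cd IN ('L-CHR', 'L-TWC', 'L-BHN')
      AND  CAST(p.event_ts AS DATE) >= DATE '2018-01-01';   -- date filter from the parameter file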

Responsibilities:

Gathering the required information from the users and preparing the design documents.

Interacted with different system groups for analysis of systems.

Developing BTEQ, MultiLoad, TPT, and FastLoad scripts to populate data into the Teradata database.

Developing SQL join indexes to support strategic and tactical queries.

Implemented SCD Type 4 logic for capturing data changes.

Populating FastLoad and MultiLoad tables using different data load and unload utilities of Teradata.

Generated a number of ad hoc reports as per client requirements.

Used MULTISET tables to load bulk data in the form of inserts and deletes.

Created indexes and joins on tables as per requirements.

Created various types of temporary tables such as volatile and global temporary tables.

Worked with collect statistics and join indexes.

Involved in unit testing, systems integration and user acceptance testing.

Involved in Preparing Test Cases and performing Testing.

Used One Automation tool extensively for scheduling and monitoring jobs

Environment: Teradata R15.10.1.4, Informatica 10.2, Linux 6.8, SQL Server 2014, Hive

LIBERTY MUTUAL, DOVER, DE

Mar 2014 to October 2016

Teradata Developer

Liberty Mutual Insurance is an American diversified global insurer and the second-largest property and casualty insurer in the United States. Its business is classified into Commercial Insurance and Personal Insurance. It offers a wide range of insurance products and services, including personal automobile, homeowners, workers’ compensation, commercial automobile, general liability, global specialty, group disability, fire, and surety. The Commercial Insurance consolidation is responsible for consolidating policies from 7 different sources into one common standard format. It collects policy details, effective date, expiration date, claiming order, and renewal order data from other sources, maintains them in its database for processing, and also creates daily and monthly files in the specified format.

Responsibilities:

Gathering the required information from the users and preparing the design documents.

Interacted with different system groups for analysis of systems.

Created tables, views in Teradata, according to the requirements.

Developing BTEQ, MultiLoad, and FastLoad scripts to populate data into the Teradata database.

Developing SQL join indexes to support strategic and tactical queries.

Populating FastLoad and MultiLoad tables using different data load and unload utilities of Teradata.

Generated a number of ad hoc reports as per client requirements.

Used MULTISET tables to load bulk data in the form of inserts and deletes.

Created indexes and joins on tables as per requirements.

Created various types of temporary tables such as volatile and global temporary tables.

Worked with collect statistics and join indexes.

Involved in unit testing, systems integration and user acceptance testing.

Involved in Preparing Test Cases and performing Testing.

Environment: Teradata R13, Informatica, AIX, OBIEE, Talend.

AT&T, Dallas, TX

Apr 2012 to Mar 2014

Teradata Developer

DSC (Data Services Corporation) gathers private line billing, customer, network, marketing, sales, financial, and product information for the purposes of reporting and information analysis. The business function of DSC can best be described as "Customer Needs Analysis".

Responsibilities:

Gathering the required information from the users and preparing the design documents.

Interacted with different system groups for analysis of systems.

Created tables, views in Teradata, according to the requirements.

Developing BTEQ, MultiLoad, and FastLoad scripts to populate data into the Teradata database.

Developing SQL join indexes to support strategic and tactical queries.

Populating FastLoad and MultiLoad tables using different data load and unload utilities of Teradata.

Generated a number of ad hoc reports as per client requirements.

Used MULTISET tables to load bulk data in the form of inserts and deletes.

Created indexes and joins on tables as per requirements.

Created various types of temporary tables such as volatile and global temporary tables.

Worked with collect statistics and join indexes.

Involved in unit testing, systems integration and user acceptance testing.

Involved in Preparing Test Cases and performing Testing.

Environment: Teradata R13, AIX, SSRS, SSIS

AT&T, Dallas, TX

Mar 2011 to Mar 2012

Teradata Developer

RUSS is the Revenue & Usage Sourcing System, and its business function is Sales and Marketing Needs Analysis. Utilizing a RUSS Tracking Database for bill cycle validation and a RUSS month-end reconciliation process, the system is responsible for collecting all billed revenue and minutes-of-use data for ABS (AT&T Business Services). It directly interfaces with and is dependent on information from a number of ABS billing systems, including the Data Warehouse (DW) system, which receives information from other ABS billing systems, and the Tailored Journals system. Files are received almost daily from 34 billing systems, each with one or more billing cycles in a month.

Responsibilities:

Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse.

Developing SQL join indexes to support strategic and tactical queries.

Used MULTISET tables to load bulk data in the form of inserts and deletes.

Created indexes and joins on tables as per requirements.

Created various types of temporary tables such as volatile and global temporary tables.

Worked with collect statistics and join indexes.

Involved in unit testing, systems integration and user acceptance testing.

Environment: Teradata R12, AIX, .Net.

AT&T, Dallas, TX

JAN 2010 to Mar 2011

Teradata Developer

SPDR (Sales Process for Delivering Revenue) pulls billed Classic SBC and BellSouth customer and revenue information from source systems and feeds it into the RUSS/SAART applications for sales compensation. SPDR collects customer data, revenue data, and service order data from EDW, WDW, affiliates, and other sources.

Responsibilities:

Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Studied the system requirements and translated them into detailed system design requirements; involved in coding and in the creation and execution of unit and system test cases.

Involved in Informatica migration testing from version 8.6 to 9.0.1.

Imported data from various sources, transformed it, and loaded it into data warehouse targets using Informatica.

Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator.

Worked with different sources such as Oracle and flat files.

Knowledge of slowly changing dimension tables and fact tables.

Environment: Informatica, Teradata R5/R12, Oracle, UNIX.

Discover Financial Services, Riverwoods, IL

Apr 2009 – Jan 2010

Teradata Developer

It was a unique experience to work as a Teradata developer for one of the premier financial companies, Discover Financial Services. The customer database is used in driving, analyzing, and executing marketing programs to improve customer retention and increase revenue. The data warehouse mandate includes logical and physical database design and implementation, business intelligence application development and rollout, and enterprise-wide information and reporting applications. Focused on developing tools and database features that facilitate query-plan analysis and performance tuning.

Responsibilities:

Worked on loading data from several flat file sources using Teradata FastLoad and MultiLoad.

Writing Teradata SQL queries for joins and other table modifications.

Transfer of large volumes of data using Teradata FastLoad, MultiLoad and T-Pump.

Database-to-Database transfer of data (Minimum transformations) using ETL (Ab Initio).

Fine-tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.

Creation of customized Mload scripts on UNIX platform for Teradata loads using Ab Initio.

Sorted data files using UNIX Shell scripting.

Fine-tuned MLoad scripts considering the number of loads scheduled and the volumes of load data.

Used a data profiler in ETL processes and data integrators to ensure client requirements were met, including checks on column properties, column values, and referential integrity.

Acted as a single resource with sole responsibility for DataStage to Teradata conversions.

Written scripts to extract the data from Oracle and load into Teradata.

Worked with datasets for analysis.

Worked with SAS on exporting data using Teradata FastExport.

Written Teradata BTEQ scripts to implement the business logic.

Hands-on with Teradata Queryman to interface with Teradata.

Used SQL Profiler for troubleshooting, monitoring, and optimizing SQL Server queries from developers and testers.

Worked on Cognos 8 Suite (Event Studio, Query Studio, Analysis Studio, and Report Studio).

Experience in designing, developing and maintaining Cognos Impromptu Catalogs, Power Play Cubes and reports

Used UNIX scripts to access Teradata & Oracle Data

Developed UNIX shell scripts for data manipulation

Involved in writing proactive data audit scripts.

Involved in writing data quality scripts for new market integration

Developed complex transformation code for derived duration fields.

Developed BTEQ scripts to extract data from the detail tables for reporting requirements.
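
A representative (and hypothetical) BTEQ export of such a report might look like this; logon, file path, and object names are illustrative only:

    .LOGON tdprod/rpt_user,password
    .SET WIDTH 200
    .EXPORT REPORT FILE = /data/reports/monthly_rev_summary.txt

    SELECT   bill_month,
             product_cd,
             SUM(billed_rev_amt) AS total_rev
    FROM     edw_db.billed_revenue_dtl
    GROUP BY 1, 2
    ORDER BY 1, 2;

    .EXPORT RESET
    .LOGOFF
    .QUIT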

Environment: NCR 4800/5100, Teradata V12/V2R6 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), Mainframes MVS/OS, JCL, TSO/ISPF, COBOL, DB2, SAS, Oracle, DataStage, Cognos, Shell scripting

Dell, Austin, TX

Jun 2008 – Apr 2009

Teradata Developer

IMD stands for Integrated Marketing Database; IMD brings together all marketing strategies in Dell and helps improve sales. The project captures online and offline campaign information and their performance relative to one another.

Responsibilities:

Involved in database design and preparing SQL scripts to support large databases involving terabytes of data.

Coordinated with the database team to create the necessary data sources for PSG (Premier Services) and FA (Financial Accounts) using Metadata Utility.

Involved in the design of complex campaigns for the Business users to accomplish different marketing strategies.

Coordinated with the test team in the design of test cases and preparation of test data to work with different channels, and set up recency and timeout for the campaigns.

Involved in running the batch process for Teradata CRM.

Extracted source data from mainframe z/OS into the UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts.

Creation of BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts.

Worked on complex queries to map the data as per the requirements.

Extracted data from various production databases SAS, SYBASE, and Teradata to meet business report needs.

Designed and implemented stored procedures and triggers for automating tasks in SQL.

Worked on some critical problems like booked metrics and solved them successfully using SAS/SQL

Interacted with technical and business analysts and operations analysts to resolve data issues.

Analyzed the current ETL (Informatica) data movement process and procedures.

Used Data profiler to allow the analysis of data directly in the database, which improves performance, while eliminating the time and costs of moving data among databases.

Identify and assess external data sources as well as internal and external data interfaces

Created and updated MS Excel mapping document by doing field level analysis and field mapping.

Environment: Teradata Visual Explain, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearQuest, Control-M, UNIX, MQ, NDM, FTP, SAS, Ab Initio (GDE 1.14, Co>OS V1.14), EME, z/OS, DB2, JCL, SPUFI


