Post Job Free

Data Manager

Location:
Alpharetta, GA
Salary:
100000
Posted:
March 28, 2018

Contact this candidate

Resume:

Eshwari Ballari

Cell: 404-***-****

E-mail: ********@*****.***

Summary:

Over 13 years of IT experience in the analysis, design, development, testing and implementation of business application systems in data warehousing for industries such as Power, Utilities, Entertainment and Insurance.

Over 9 years of experience in ETL methodologies supporting data extraction, transformation and loading using Teradata, Informatica and SQL Server.

Knowledge of different schemas (Star and Snowflake) to fit reporting, query and business analysis requirements.

Proficient in using Informatica PowerCenter to design data conversions from disparate source systems.

Proficient in understanding business processes/requirements and translating them into technical requirements.

Expertise in database programming: writing SQL, stored procedures, functions, triggers and views in Teradata, Oracle, DB2 and MS Access.

Extensively utilized business intelligence reporting tools such as Business Objects 5.x/6.x.

Extensively worked with database utilities such as SQL*Loader, DTS, the Teradata utilities (FastLoad, MultiLoad, FastExport, TPump, QueryMan, BTEQ) and TOAD.

Experienced in UNIX shell scripting for file manipulation and text processing.

Involved in all stages of the software development life cycle: design, specification, coding, debugging, testing (test planning and execution), documentation and maintenance.

Strong communication, presentation and interpersonal skills with excellent problem-solving capabilities.
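The Teradata load utilities listed above are driven by small control scripts. As an illustration only, the shell sketch below generates a FastLoad script for a hypothetical pipe-delimited customer extract; every table, file and logon name is an invented placeholder, and the exact FastLoad statement order should be checked against the Teradata documentation for the installed release.

```shell
#!/bin/sh
# Illustrative only: generate a Teradata FastLoad control script for a
# pipe-delimited extract. All object, file and logon names are hypothetical.
TABLE="stage.customer_stg"
DATAFILE="/data/in/customer.dat"
SCRIPT=/tmp/customer.fld

cat > "$SCRIPT" <<EOF
LOGON tdpid/etl_user,etl_password;
BEGIN LOADING $TABLE ERRORFILES stage.cust_err1, stage.cust_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id (VARCHAR(10)),
       cust_name (VARCHAR(100))
FILE=$DATAFILE;
INSERT INTO $TABLE VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF

echo "FastLoad script written to $SCRIPT"
```

In practice a wrapper like this would then invoke `fastload < "$SCRIPT"` and check the exit status.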

Certifications:

Teradata Certified Master V2R5

Technical Skills:

DWH/ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, Informatica PowerMart, Informatica PowerExchange, IBM WebSphere DataStage 11

Reporting Tools: Business Objects 6.x

Databases: Oracle 10g/9.x/8.x/7.x, Teradata, SQL Server 2008/2008 R2/2012, MS Access

Tools: SQL*Loader, TOAD, QTP 8.2, Test Director, JSC

Data Modeling Tools: Erwin 4.0/3.5, MS Visio

Scripting Languages: UNIX Shell Scripting

Operating Systems: Windows XP/NT/2000, UNIX, MS DOS

Work Experiences:

Macy's Systems and Technology, GA

Dec’11– Present

ETL Developer (Teradata/SQL Server)

Responsibilities:

Understand business needs and implement them in a functional database design.

Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities and IBM WebSphere DataStage 11.

Explore ways to optimize existing ETL processes and enhance their performance.

Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse; the tools used most extensively were MultiLoad, FastLoad, FastExport and BTEQ.

Apply broad, in-depth business and technical knowledge to production support and sustainment activities.

Work closely with business users and analyze source data to develop the transformation logic.

Use Workflow Manager for workflow and session management, database connection management and scheduling of jobs to run in the batch process.

Worked with SQL Server and flat-file sources; the data was loaded into a DB2 database.

Used ETL to extract data from Oracle and flat files and load it into the Oracle DWH.

Involved in the analysis, documentation and testing of workflows.

Wrote and optimized SQL queries in SQL Server 2008, 2008 R2 and 2012.

Built SSIS packages to transform source data into target SF objects.

Built SSIS packages to generate monthly forecasting fact/dimension cube data for loading into the Hyperion Planning database.

Built reusable SSIS packages for logging/auditing and for moving table data to vertical-bar-delimited flat files.

Designed and developed an extraction, transformation and loading (ETL) process framework and SSIS packages.

Used Data Transformation Services (DTS)/SQL Server Integration Services (SSIS), SQL Server's Extract, Transform, Load (ETL) tools, to populate data from various data sources, creating packages for the application's different data loading operations.

Involved in the complete SSIS life cycle: creating SSIS packages and building, deploying and executing them in both the Development and Production environments.

Knowledge of and work experience with RDBMS concepts: views, triggers, stored procedures, indexes and constraints.

Performed data analysis and data profiling using SQL on various source systems, including SQL Server 2008.

Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and in handling large databases to perform data manipulations.

Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on enhancements to the fact tables.

Coordinate with team members and administer all onsite and offshore work packages.

Created MDM and Data Governance roadmaps and implemented strategic plans for the future of data management.

Utilized data from an external provider to properly classify MDM data components (customer category, sub-category, etc.).

Developed MDM strategies and solutions using the SQL Server MDS tools.
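As a sketch of the vertical-bar-delimited export pattern described above (shown outside SSIS, since SSIS packages are configured graphically), a shell equivalent with a simple audit line might look like the following; the file names and columns are invented for illustration.

```shell
#!/bin/sh
# Illustrative only: export data (stubbed here as a CSV sample) to a
# vertical-bar-delimited flat file and write an audit line with the row
# count. File names and columns are hypothetical.
SRC=/tmp/orders_sample.csv
OUT=/tmp/orders.dat
LOG=/tmp/export_audit.log

# Stand-in for the real table extract
printf 'order_id,store,amount\n1001,ATL,25.50\n1002,NYC,13.75\n' > "$SRC"

# Convert the comma-separated extract to vertical-bar-delimited,
# skipping the header row
awk -F',' 'NR > 1 { print $1 "|" $2 "|" $3 }' "$SRC" > "$OUT"

ROWS=$(wc -l < "$OUT")
echo "$(date '+%Y-%m-%d %H:%M:%S') exported $ROWS rows to $OUT" >> "$LOG"
cat "$OUT"
```

The audit log gives downstream jobs a cheap row-count reconciliation check against the source extract.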

Environment: Teradata 15.0, Teradata SQL Assistant 15.0, Teradata Utilities (MLoad, FastLoad, FastExport, BTEQ), PMON, UNIX Shell Scripting, Oracle 11g, SQL, UNIX, SVN, Mainframe z/OS, SQL Server 2008/2008 R2/2012, SSAS, SSRS, SSIS, MDS, MDM

CLA, GA

Oct’10– Nov'11

ETL Developer

Responsibilities:

Understand business needs and implement them in a functional database design.

Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities and Informatica PowerCenter 8.6.

Explore ways to optimize existing ETL processes and enhance their performance.

Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse; the tools used most extensively were MultiLoad, FastLoad, FastExport and BTEQ.

Apply broad, in-depth business and technical knowledge to production support and sustainment activities.

Work closely with business users and analyze source data to develop the transformation logic.

Use Workflow Manager for workflow and session management, database connection management and scheduling of jobs to run in the batch process.

Developed a number of complex Informatica mappings, mapplets and reusable transformations.

Tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches and target data.

Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression and Sequence Generator.

Worked with SQL Server and flat-file sources; the data was loaded into a DB2 database.

Designed and developed an extraction, transformation and loading (ETL) process framework and SSIS packages.

Used Data Transformation Services (DTS)/SQL Server Integration Services (SSIS), SQL Server's Extract, Transform, Load (ETL) tools, to populate data from various data sources, creating packages for the application's different data loading operations.

Involved in the complete SSIS life cycle: creating SSIS packages and building, deploying and executing them in both the Development and Production environments.
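Batch scheduling of Informatica workflows, as described above, is usually done by wrapping the pmcmd command-line client in a shell script. The sketch below only builds and prints the command rather than executing it; the service, domain, folder and workflow names are hypothetical, and the pmcmd option spellings should be verified against the installed PowerCenter release.

```shell
#!/bin/sh
# Illustrative only: build (but do not execute) the pmcmd call a batch
# scheduler would use to start an Informatica workflow and wait for it.
INT_SVC="IS_DEV"          # hypothetical Integration Service name
DOMAIN="Domain_Dev"       # hypothetical domain name
FOLDER="CLAIMS"           # hypothetical repository folder
WORKFLOW="wf_load_claims" # hypothetical workflow name

# -uv/-pv name environment variables holding the credentials, so no
# password appears in the script itself.
CMD="pmcmd startworkflow -sv $INT_SVC -d $DOMAIN -uv PM_USER -pv PM_PASS -f $FOLDER -wait $WORKFLOW"
echo "$CMD" > /tmp/run_workflow.cmd
echo "Would run: $CMD"
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the scheduler can branch on its exit status.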

Environment: Teradata V2R5, Teradata SQL Assistant 12.0, Teradata Utilities (MLoad, FastLoad, FastExport, BTEQ), PMON, UNIX Shell Scripting, Informatica PowerCenter 8.6, SQL Server 2008/2008 R2/2012, SSAS, SSRS, SSIS

Home Depot, Atlanta, GA

May’10– Sept’10

Teradata Developer

Responsibilities:

Requirements analysis, data assessment, business process reengineering.

Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities.

Explored ways to optimize existing ETL processes and enhance their performance.

Designed the logical and physical data models (LDM and PDM) for the new Ratings Stream data mart and enhanced the existing EDW physical data model.

Coded complex, highly optimized SQL that looks up the third-normal-form core EDW physical data model and outputs to the denormalized data mart, per requirements.

Created and executed test plans for the unit, integration and system test phases.

Automated related tasks by developing the UNIX shell scripts used to maintain the core EDW.

Worked closely with project managers, business analysts, the BI architect, source system owners and the Data Management/Data Quality team to ensure timely and accurate delivery of business requirements.

Assisted SIT/UAT testers in optimizing test scripts and ensuring data accuracy and integrity.

Extensive design and development work using UNIX shell scripting and SQL.

Gained design and development experience using shell scripting in the Ab Initio environment.
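A minimal example of the kind of EDW housekeeping script mentioned above: sweep processed extract files out of a landing directory into a compressed archive. The paths and file names are invented placeholders.

```shell
#!/bin/sh
# Illustrative only: archive processed load files. Paths and the file
# naming convention are hypothetical.
LANDING=/tmp/edw_landing
ARCHIVE=/tmp/edw_archive
mkdir -p "$LANDING" "$ARCHIVE"

# Stand-ins for extract files already loaded into the warehouse
touch "$LANDING/sales_20100501.dat" "$LANDING/sales_20100502.dat"

for f in "$LANDING"/*.dat; do
    [ -e "$f" ] || continue   # nothing to do if the glob matched no files
    # Compress into the archive, then remove the original only on success
    gzip -c "$f" > "$ARCHIVE/$(basename "$f").gz" && rm "$f"
done

ls "$ARCHIVE"
```

A real version would typically add a retention sweep (e.g. `find "$ARCHIVE" -mtime +30 -delete`) driven by the site's retention policy.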

Environment: Teradata V2R5, Teradata SQL Assistant 12.0, Teradata Administrator 6.0, Teradata Utilities (MLoad, FastLoad, FastExport, BTEQ), ARC Utilities, Erwin 4.1.4, PMON, UNIX Shell Scripting, Ab Initio.

Capital One, VA

Sept’09 – Mar’10

Sr. Software Engineer

Responsibilities:

Designed the logical and physical data models (LDM and PDM) for the new Ratings Stream data mart and enhanced the existing EDW physical data model.

Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities.

Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse; the tools used most extensively were MultiLoad, FastLoad, FastExport and BTEQ.

Developed and maintained complex SQL with multi-table JOIN conditions for the Teradata RDBMS.

Extensive experience with the Teradata load/unload utilities (MultiLoad, FastLoad, FastExport, BTEQ, etc.).

Involved in the ongoing delivery of migrating client mini data warehouses and functional data marts from an Oracle environment to Teradata; debugged problems arising from the migration (conversion of data types, views, tables, etc.).

Resolved and worked on issues across multiple functional areas; monitored and took action to ensure the coordination and effectiveness of all components and activities, and decided which issues required escalation.

Used the UNIX operating system and developed basic UNIX shell scripts.

Performed SQL tuning and error handling.
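The SQL error-handling work above typically revolves around BTEQ return codes. The sketch below writes, but does not execute (no Teradata system is assumed), a BTEQ script that aborts with a nonzero code on SQL failure; all object and logon names are hypothetical, and the dot-command spellings should be checked against the BTEQ reference.

```shell
#!/bin/sh
# Illustrative only: a BTEQ script with basic error handling, written to
# a file rather than run. Table and logon names are hypothetical.
SCRIPT=/tmp/upd_balance.btq

cat > "$SCRIPT" <<'EOF'
.LOGON tdpid/etl_user,etl_password;

UPDATE acct.balance
SET amt = amt * 1.01
WHERE cycle_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

# In production: bteq < "$SCRIPT", then branch on $? (0 = success,
# 8 = the abort code chosen above) to decide whether to alert support.
echo "BTEQ script written to $SCRIPT"
```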

Environment: Teradata V2R5, Teradata SQL Assistant 6.1, Erwin 4.1.4, Teradata Utilities (MLoad, FastLoad, FastExport, BTEQ), CONTROL-M, HP Help Desk, JSC, UNIX Shell Scripting.

Coca Cola Enterprises, Atlanta, GA

Oct’07 – Aug’09

Teradata Analyst

Responsibilities:

Designed, created and regularly tuned physical database objects (tables, views, indexes) to support normalized and dimensional models.

Requirements analysis, data assessment and business process reengineering.

Provide ongoing support by developing processes and executing object migrations, security and access privilege setup, and active performance monitoring.

Index maintenance and analysis.

Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities.

Explored ways to optimize existing ETL processes and enhance their performance.

Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse; the tools used most extensively were MultiLoad, FastLoad, FastExport and BTEQ.

Apply broad, in-depth business and technical knowledge to production support and sustainment activities.

Resolved and worked on issues across multiple functional areas; monitored and took action to ensure the coordination and effectiveness of all components and activities, and decided which issues required escalation.

Helped create and maintain the operational documentation needed to support the solution in a production environment.

Designed and created the Business Objects universes. Developed classes with dimension, detail and measure objects and specified the hierarchies. Interacted closely with business users, business owners and the ETL/database team to understand their business views and develop and modify the universes accordingly. Established user groups and users and assigned privileges. Created thin-client reports using Web Intelligence.

Specified the joins and checked the cardinalities. Resolved loops by creating aliases and contexts.

Extensive design and development work using UNIX shell scripting and SQL.

Environment: Teradata V2R5, MVS, Teradata SQL Assistant 6.1, Teradata Administrator 6.0, Teradata Utilities (MLoad, FastLoad, FastExport, BTEQ), ARC Utilities, Teradata Manager, PMON, ASF, Teradata Priority Scheduler, JSC, Business Objects 6.5, UNIX Shell Scripting.

GE Power Systems, Atlanta, GA

Feb’07 – Sept’07

Analyst

Responsibilities:

Requirements analysis, data assessment and business process reengineering.

Provide ongoing support by developing processes and executing object migrations, security and access privilege setup, and active performance monitoring.

Index maintenance and analysis. Work with ETL leads to formulate the ETL approach, making appropriate use of the Teradata Tools and Utilities.

Explored ways to optimize existing ETL processes and enhance their performance.

Developed applications that move data from legacy systems and external data sources into a staging area, transform it, and load it into the Teradata enterprise data warehouse; the tools used most extensively were MultiLoad, FastLoad, FastExport and BTEQ.

Designed and created the Business Objects universes. Developed classes with dimension, detail and measure objects and specified the hierarchies. Interacted closely with business users, business owners and the ETL/database team to understand their business views and develop and modify the universes accordingly. Established user groups and users and assigned privileges. Created thin-client reports using Web Intelligence.

Specified the joins and checked the cardinalities. Resolved loops by creating aliases and contexts.

Involved in configuring and installing Business Objects 6.5.

Environment: Teradata V2R5.0, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Teradata SQL, Informatica PowerCenter 7.1.3, Business Objects 6.5, Erwin 4.0/3.5, Oracle 9i, UNIX Shell Scripting.

DIRECTV, LA

Apr’06 – Dec’06

Developer

Responsibilities:

Developed the initial load programs to migrate DIRECTV’s commercial databases from Oracle data marts to the Teradata warehouse, as well as the ETL framework supplying continuous engineering and manufacturing updates to the data warehouse (Oracle, Teradata, ODBC).

Involved in the ongoing delivery of migrating client mini data warehouses and functional data marts from an Oracle environment to Teradata; debugged problems arising from the migration (conversion of data types, views, tables, etc.). Used Informatica (ETL) to load the data into the data warehouse systems.

Created parameter files for the workflows and mappings so the various applications could use the different environments: Development, Model Office (Testing) and Production. Created complex mappings to compare the two databases, Oracle and Teradata.

Used Workflow Manager to create, validate, test and run the workflows and sessions and to schedule them to run at specified times. Implemented performance tuning of the transformations, the sessions and the source databases to load data efficiently into the target.

Developed FastLoad, MultiLoad and BTEQ scripts to load data from various data sources and legacy systems into Teradata.
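The per-environment parameter files mentioned above can be generated by a small script so that Development, Model Office and Production stay in sync. The section-header format below follows Informatica's parameter-file convention as commonly documented; the folder, workflow and connection names are invented for illustration.

```shell
#!/bin/sh
# Illustrative only: emit one Informatica parameter file per environment.
# Folder, workflow and connection names are hypothetical placeholders.
for ENV in DEV MO PRD; do
  {
    echo "[MIGRATION.WF:wf_ora_to_td]"
    echo "\$DBConnection_Src=ORA_$ENV"
    echo "\$DBConnection_Tgt=TD_$ENV"
  } > "/tmp/wf_ora_to_td.$ENV.par"
done

cat /tmp/wf_ora_to_td.PRD.par
```

Each workflow then points at the file for its environment, so the same mapping runs unchanged against Oracle/Teradata connections for DEV, Model Office or Production.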

Environment: Informatica PowerCenter 7.1.3, Teradata V2R5.0, Teradata Utilities (FastLoad, MultiLoad, FastExport, SQL Assistant, BTEQ), Teradata SQL, Oracle 9i, UNIX Shell Scripting, Windows XP

Ameri Group, VA

Aug’04 – Mar’06

ETL Developer

Responsibilities:

Business analysis and requirements collection.

Understand business needs and implement them in a functional database design.

Work closely with business users and analyze source data to develop the transformation logic.

Use Workflow Manager for workflow and session management, database connection management and scheduling of jobs to run in the batch process.

Developed a number of complex Informatica mappings, mapplets and reusable transformations.

Tested stored procedures and functions; performed unit and integration testing of Informatica sessions, batches and target data.

Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression and Sequence Generator.

Worked with SQL Server and flat-file sources; the data was loaded into Oracle and XML targets.

Optimized query performance, session performance and reliability.

Environment: Informatica PowerCenter 7.1, Oracle 9i, MS SQL Server 2000, UNIX Shell Scripting, Business Objects 6.x, XML, MS PowerPoint, TOAD, UNIX, Win NT 4.0, Erwin 4.0

PMC Pharmaceuticals, PA

Jul’02- Jul’04

Programmer Analyst

Responsibilities:

Involved in the analysis of the extraction, transformation and loading process, and in creating the mappings between source and target tables. Used transformations such as Expression, Aggregator, Source Qualifier, Sequence Generator, Filter, Router, Update Strategy and Lookup, and created reusable transformations.

Involved in unit testing as well as functional testing. Created the sessions and batches using Server Manager.

Involved in debugging mappings and tuning them for better performance. Created staging tables, indexes, sequences and views, and performed tuning such as analyzing tables and proper indexing.

Identified and resolved bottlenecks in sources, targets, transformations, mappings and sessions to improve performance. Extensive use of transformations such as Joiner, Aggregator, connected and unconnected Lookup, Filter, Sequence Generator, Router, Expression and Sorter.

Environment: Informatica 5.1, Business Objects, Oracle 8i, Windows NT

Education:

Bachelor of Science - Hyderabad, INDIA


