
Data Developer

Location:
United States
Posted:
August 20, 2016


Resume:

BHARATH DANTULURI

acv9er@r.postjobfree.com

248-***-****

Summary

Over 9.5 years of IT professional experience in Analyzing, Designing, Developing, Testing, Implementing and Maintaining Data Warehouse business systems.

Experience in ETL (data extraction, transformation and loading) using IBM InfoSphere Information Server 11.3/8.5/8.1 (DataStage, QualityStage), with DataStage Designer, Director, Administrator and Parallel Extender, to implement ETL solutions and Data Warehousing projects.

Six years of experience in UNIX shell scripting and SQL.

Extensive experience in Analysis, Design, Data Extraction, Cleansing, Transformation and Loading into Data Marts.

Experience in Dimensional Data Modeling (Star Schema, Snow-Flake Schema) Data Architecture, Business and Data Analysis.

Designed Technical Design Specifications and Mapping Documents with Transformation Rules.

Extensively worked on DataStage Parallel Extender Edition.

Used both Pipeline and Partition Parallelism for improving performance.

Experience in DataStage Cluster System Setup and Configuration.

Experience in developing Parallel jobs using various stages like Join, Merge, Lookup, Funnel, Sort, Transformer, Copy, Remove Duplicate, Filter, Pivot and Aggregate stages for grouping and summarizing on key performance indicators used in decision support systems.
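As an illustrative sketch only (not taken from any project above), the group-and-summarize work that an Aggregator stage performs can be mimicked on a pipe-delimited file with awk; the file layout and column positions here are assumptions:

```shell
#!/bin/sh
# Sketch of Aggregator-style group-and-sum on a pipe-delimited file.
# Assumes column 1 is the grouping key and column 2 a numeric measure.

aggregate_by_key() {
    # $1 = pipe-delimited input file; prints "key total" per key, sorted
    awk -F'|' '{ sum[$1] += $2 } END { for (k in sum) printf "%s %s\n", k, sum[k] }' "$1" | sort
}
```

For example, `aggregate_by_key sales.txt` (file name hypothetical) prints one total line per distinct key, much as an Aggregator stage grouped on that key would.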

Having very good experience with Oracle connector Stage.

Frequently used the Peek, Row Generator and Column Generator stages for debugging.

Expertise in Software Development Life Cycle (SDLC) of Projects - System study, Analysis, Physical and Logical design, Coding and implementing business applications.

Expertise in performing Data Migration from various legacy systems to target database.

Expertise in Data Modeling, OLAP/ OLTP Systems, generation of Surrogate Keys.

In depth knowledge of Star Schema, Snow Flake Schema, Dimensional Data Modeling, Fact and Dimension tables.

Experience in Data Warehouse development, worked with Data Migration and ETL using IBM DataStage with Oracle, Teradata.

Experience in creating ETL mapping documents for extracting, transforming and loading data into the data warehouse.

Extensive experience in development, debugging, troubleshooting, monitoring and performance tuning using DataStage Designer, Director, and Administrator.

Prepared job sequences and job schedules to automate the ETL processes.

Experience in handling multiple relational databases like Oracle, Complex Flat Files, Delimited Files, and Teradata for Extraction, Staging and Production data warehouse environments.

Experience with UNIX Shell Scripting for Data Validations and Scheduling the DataStage Jobs.
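A minimal sketch of the kind of pre-load validation such a script might perform; the delimiter and expected column count are assumptions, and the dsjob call mentioned afterwards is indicative only:

```shell
#!/bin/sh
# Sketch: reject an input file whose rows don't all have the expected
# number of delimited columns. File name and layout are illustrative.

validate_delimited_file() {
    # $1 = file, $2 = delimiter, $3 = expected column count
    # Prints the number of malformed rows; returns non-zero if any exist.
    bad=$(awk -F"$2" -v n="$3" 'NF != n { c++ } END { print c + 0 }' "$1")
    echo "$bad"
    [ "$bad" -eq 0 ]
}
```

On a clean file the wrapper could then go on to start the load, e.g. with `dsjob -run -jobstatus <project> <job>` (project and job names hypothetical).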

Used DataStage Version Control to promote DataStage jobs from Development to Testing and then to Production Environment.

Strong analytical, problem solving and leadership skills and has ability to interact with various levels of management to understand requests and validate job requirements.

Team player with strong ability to quickly adapt to any dynamic developments in projects and capable of working in groups as well as independently.

Knowledge of the Informatica MDM tool.

Knowledge of the Hadoop ecosystem.

Knowledge of BI tools such as Cognos.

TECHNICAL SKILLS

Ascential Software DataStage (versions 8.0.1, 8.1, 8.7, 11.3), Parallel Extender

Data modeling tools Erwin

Operating Systems Windows XP, 7, UNIX, AIX, Linux

Languages SQL, UNIX (AIX) shell scripting, AWK, Sed

Databases Oracle 11g/10g/9i, Teradata, Netezza

Applications and Tools Microsoft Office (Excel, Word, PowerPoint)

Other SQL Developer, SQL*Plus, EditPlus, PuTTY, Toad

Scheduler Tivoli (9.1) scheduling and monitoring

Version Control tools SVN

Migration Tools CLM

Big Data Tools Hadoop (HDFS), Pig, Hive, HBase, Sqoop, Flume

BI tool Cognos

Client: MGM Resorts Intl, Las Vegas, NV JUNE 2015 – Present

Role: ETL Lead /DataStage Developer

Analyzed, designed, developed, implemented and maintained parallel jobs using IBM InfoSphere DataStage.

Involved in the design of the dimensional data model: Star schema and Snowflake schema.

Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.

Created routines (Before/After, Transform functions) used across the project.

Experienced with PX file stages, including the Complex Flat File, Data Set, Lookup File and Sequential File stages.

Adept knowledge and experience in mapping source to target data using IBM DataStage 8.x.

Implemented multi-node declaration using configuration files (APT_Config_file) for performance enhancement.
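For illustration, a two-node parallel engine configuration file of the kind referenced above might look like the following; the host and path names are invented placeholders:

```
{
    node "node1"
    {
        fastname "etl_host"
        pools ""
        resource disk "/data/ds/resource" {pools ""}
        resource scratchdisk "/data/ds/scratch" {pools ""}
    }
    node "node2"
    {
        fastname "etl_host"
        pools ""
        resource disk "/data/ds/resource" {pools ""}
        resource scratchdisk "/data/ds/scratch" {pools ""}
    }
}
```

Pointing APT_CONFIG_FILE at a file like this lets the same job run across more partitions without redesigning the job itself.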

Experienced in developing parallel jobs using various Development/debug stages (Peek stage, Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicate Stage)

Debug, test and fix the transformation logic applied in the parallel jobs

Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution.

Used the DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.

Experienced in using SQL developer to populate tables in the data warehouse.

Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.

Deployed different partitioning methods, such as Hash by column, Round Robin, Entire, Modulus and Range, for bulk data loading and performance gains.

Wrote shell scripts for data validation on the AIX server.

Involved in testing the Cognos reports and tools.

Repartitioned job flows after determining the best available DataStage PX resource consumption.

Environment: DataStage/PX 11.3 on AIX, Oracle, SQL Developer, Tivoli, Cognos, Teradata, Erwin, SVN, PuTTY

Client: Cardinal Health, Dublin, OH JAN 2014 – MAY 2015

Role: Senior ETL/DataStage Developer

Analyzed the requirements.

Designed jobs based on the requirements and analysis.

Coordinated with the team on deliverables.

Resolved queries raised by the team.

Provided optimal solutions and suggestions for problems raised.

Coordinated the team during the implementation phase.

Gathered requirements from business users and functional analysts.

Interacted with business users to understand and clarify the functional requirements.

Prepared technical specifications based on business requirements.

Prepared project estimates.

Analyzed and understood the data available in the source system.

Proposed and reviewed the design based on the requirements.

Designed the integration between different systems.

Facilitated the implementation of the code in the production environment.

Prepared the Technical Design Document, Test Cases Document and Operation Support Document.

Involved in supporting and monitoring the production applications to ensure all systems are available to the business round the clock.

Involved in enhancement of existing production applications for changing business needs.

Environment: WebSphere DataStage/PX 8.1 on AIX, Oracle databases for the data warehouse, Cognos, AWK, Sed, PuTTY, SQL Developer, Tivoli

Client: QBE Americas, Sun Prairie, WI AUG 2013 – DEC 2013

Role: Senior ETL/DataStage Developer

Responsibilities:

Identified business needs, evaluated business and technical alternatives, recommended solutions and participated in their implementation.

Performed requirement gathering and business analysis by attending requirement sessions.

Used DataStage Designer to create the table definitions for the CSV and flat files, import the table definitions into the repository, import and export the projects, release and package the jobs.

Developed ETL processes to extract the source data and load it into the enterprise-wide data warehouse after cleansing, transforming and integrating.

Imported metadata from repository, created new job categories, routines and data elements.

Designed and developed jobs using DataStage Designer as per the mapping specifications using appropriate stages.

Worked extensively with AutoSys for scheduling jobs and provided production support for AutoSys.

Used UNIX scripts to copy files from the development server to the production server for testing DataStage jobs.

Worked with the UNIX admin to resolve DataStage bug-related errors.

Developed job sequences to execute a set of jobs with restartability and checkpoints, and implemented proper failure actions.

Wrote UNIX shell scripts to read parameters from files for invoking DataStage jobs.
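A hedged sketch of such a parameter-file reader; the file format (one NAME=VALUE per line, '#' starting a comment) and the parameter names shown are assumptions, and the dsjob invocation in the usage note is indicative only:

```shell
#!/bin/sh
# Sketch: turn NAME=VALUE lines from a parameter file into the -param
# arguments a dsjob invocation expects. Assumes values contain no '='.

build_param_args() {
    # $1 = parameter file; '#' starts a comment line
    awk -F= '/^[[:space:]]*#/ { next } NF == 2 { printf "-param %s=%s ", $1, $2 }' "$1"
}
```

A caller might then run something like `dsjob -run $(build_param_args job.params) -jobstatus MyProject MyJob`, where the file, project and job names are hypothetical.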

Created source to target mapping and job design documents from staging area to Data Warehouse.

Used DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).

Worked on troubleshooting, performance tuning, performance monitoring and enhancement of DataStage jobs.

Wrote several complex SQL queries to extensively test the ETL process.

Provided production support and performed enhancement on existing multiple projects.

Worked with AutoSys to set up production job cycles for daily, weekly and monthly loads with proper dependencies.

Performed unit testing and system integration testing in the dev and UAT environments.

Environment: WebSphere DataStage/PX 8.0.1 on UNIX, Oracle for the data warehouse, shell scripting, Teradata, SQL Developer, AutoSys

Client: Bank of America, Charlotte, NC JAN 2013 – JUL 2013

Role: Senior ETL/DataStage Developer

Created DataStage job templates for Change Data Capture and Surrogate Key Assignment processes.

Provided the development team with DataStage job templates used as a standard for ETL development across the project.

Wrote shell scripts to automate DataStage jobs.

Conducted a detailed analysis of relevant heterogeneous data sources, such as flat files and Oracle, for use in the ETL process.

Developed jobs based on the given functional design documents; performed coding and testing, and supported user acceptance testing.

Involved in production implementation process.

Used the DataStage Designer to develop processes for extracting data.

Implemented and maintained the ETL (extraction, transformation and cleansing) process using DataStage.

Performed import and export of DataStage components and table definitions using DataStage Manager.

Created batches for different regions, scheduled to perform data loading.

Involved in Unit testing and preparation of test cases for the developed jobs.

Environment: WebSphere DataStage/PX 8.0.1 on UNIX, Oracle for the data warehouse, shell scripting, Teradata, SQL Developer.

IBM India, India JAN 2007 – DEC 2012

Client: Vodafone India

Role: ETL/DataStage Developer

Responsible for the development, maintenance and support of ETL templates and jobs to load source system data into the Oracle warehouse, and for mentoring team members.

Utilized DataStage to design, build and support jobs used to populate the Oracle warehouse.

Managed migrations from development to test to production.

Created Tivoli jobs and schedules to run cycles in test and production.

Managed all operations of the data warehouse.

Created DataStage job templates for developers to follow when creating new jobs.

Mentored a team of 8 developers on using DataStage to build, run and test jobs and sequences.

Analyzed and determined which jobs and sequences needed to be modified or created, and assigned them to development team members.

Created and set up new DataStage projects, including UNIX directories and environment variables.

Tracked team members' assignments, completion dates and progress using an Excel spreadsheet.

Environment: DataStage Server 7.5.2 on AIX, Oracle databases for source and data warehouse, Netezza.


