Post Job Free


SaaS Experience

Location:
NJ, 08902
Salary:
$55/hr C2C
Posted:
June 24, 2010


Resume:

SATISH

DATASTAGE DEVELOPER

ni68y8@r.postjobfree.com

908-***-**** x 114

________________________________________

SUMMARY

• Over 7 years of IT experience in Analysis, Design, Development, Implementation and Testing of Client/Server and Data Warehousing applications.

• Working knowledge of software development life cycle (SDLC) and project methodologies, as well as tools and techniques within each phase.

• 5+ years of in-depth experience in Extraction, Transformation and Loading (ETL) processes using DataStage 8.0.1/7.5/7.1/7.0/6.0XE/5.2/EE (Parallel Extender).

• Experience in design and implementation of Star and Snowflake schemas and multidimensional modeling.

• Worked on the DataStage client tools: Designer, Director, and Manager.

• Experience in designing, compiling, testing, scheduling, and running DataStage jobs.

• Expertise in data warehousing techniques for data cleansing and Slowly Changing Dimensions (SCDs).

• Experience in working with Parallel Extender for Parallel Processing to improve job performance while working with bulk data sources. Worked with most of the parallel stages applying different partitioning techniques.

• Experience in QualityStage for data cleansing tasks.

• Worked with QualityStage's Investigate, Standardize, and Match Frequency stages, including country-wide segregation of address data.

• Strong knowledge of Extraction Transformation and Loading (ETL) processes using UNIX shell scripting, SQL Loader.

• Excellent knowledge of studying data dependencies using DataStage metadata and preparing job sequences for existing jobs to facilitate scheduling of multiple jobs using DataStage Director.

• RDBMS experience with Oracle 10g/9i/8i/8.0/7.3, SQL Server, DB2, Teradata including database development, PL/SQL programming, Triggers, Functions.

• Experience in using PL/SQL, SQL*Loader, ODBC, SQL, Toad, UNIX Shell Scripting.

• Experience in integration of various data sources (DB2, DB2-UDB, SQL Server, Sybase, Oracle, Teradata, and MS Access) into the data staging area and target data warehouse.

• Developed Strategies for Data conversions, Extraction Transformation Loading (ETL) processes.

• Strong experience in Scheduling jobs using UNIX Shell Scripts and Crontab.

• Performed Debugging, troubleshooting, monitoring and performance tuning using DataStage.

• Excellent experience in writing numerous routines and transforms for various specifications such as date conversions, file-based operations (splitting and moving files), and string conversions.

• Excellent skills in documenting the ETL process to facilitate an understanding of the entire ETL process and incorporate changes as and when needed.

• Excellent communication and organizational skills; self-motivated, hardworking, able to work independently or cooperatively in a team, and quick to learn.
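The cron-based job scheduling listed above can be sketched as a wrapper script invoked from a crontab entry. This is a minimal, self-contained illustration only; the job name, schedule, and paths are hypothetical, and the actual ETL invocation (e.g. a DataStage job run) is replaced with a no-op.

```shell
#!/bin/sh
# Minimal sketch of a cron-driven ETL wrapper. A crontab entry such as:
#   30 2 * * * /opt/etl/run_nightly.sh >> /var/log/etl/nightly.log 2>&1
# would invoke this script every night at 02:30.
# All names and paths here are illustrative, not from the resume.

JOB_NAME="nightly_load"
STAMP=$(date +%Y%m%d_%H%M%S)

run_job() {
    # Stand-in for the real ETL invocation (e.g. a DataStage job run);
    # a no-op here so the sketch is self-contained.
    return 0
}

if run_job; then
    echo "$STAMP $JOB_NAME completed"
else
    rc=$?
    echo "$STAMP $JOB_NAME FAILED (rc=$rc)" >&2
    exit "$rc"
fi
```

Logging to a datestamped file and exiting non-zero on failure lets cron's mail (or a monitoring job) surface failed runs.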

Education:

Bachelor of Engineering from Kakatiya University, India.

Certification:

Oracle Certified Professional, exam 1Z0-007 (Oracle 9i).

Technical skills:

ETL:

IBM WebSphere Information Server 8.0, IBM WebSphere EE 7.5.1, IBM WebSphere DataStage 7.5.1/7.0/6.0 (Manager, Administrator, Designer, Director, Parallel Extender), Integrity, ETL, Data Warehousing, Metadata, Data Mart, OLAP, OLTP, SQL*Plus

Reporting Tools:

Oracle Reports 6.0, Cognos 7.0, Crystal Reports 7.0

Databases:

Oracle 10g/9i/8i/8/7.x, DB2, SQL Server 2000, Teradata, MySQL 5.0, MS Access 7.0

Database Modeling:

ERwin 4.5/4.0/3.5/3.0, Star Schema, Snowflake Schema, Fact and Dimension Tables, Physical and Logical Data Modeling, Dimensional Data Modeling

Database Tools:

SQL*Loader, SQL*Plus, TOAD

Environment:

Windows NT/95/98/2000/XP, UNIX, MS-DOS, Red Hat Linux, AIX, SunOS 5.10

Professional Experience:

AT&T, Alpharetta, GA. Jul 2009 - Present.

Datastage Developer

AT&T Wireless is the largest communications holding company in the United States. AT&T operates the nation's fastest 3G network, serving 78.2 million customers with the best worldwide wireless coverage and the most phones that work in the most countries. The project focused on designing a data warehouse to enhance the decision support system. The warehouse is intended to help the Sales Department categorize customers by patterns such as wireless, IPTV, promotion response, and geographical area. Using ad hoc analysis, it is expected to assist in defining a strategy for each customer category.

Responsibilities:

• Involved in Customer Financial Management project, which deals mainly in Customer Information, Invoices and Payments.

• Worked with the new features in DataStage 8.1 and migrated all jobs from 7.5 to 8.1.

• Prepared naming standards documents, best practices documents, release documents, and code migration steps.

• Prepared Functional specification documents, Process flow diagrams, Technical specification document, Mapping Documents for Source to Target mapping.

• Collaborated with Business analysts and the DBA for requirements gathering, business analysis and designing of the technical requirements document.

• Performed OLTP/OLAP system study, understanding database schemas such as Star and Snowflake schemas used in relational, dimensional, and multidimensional modeling.

• Designed and developed Data model using Erwin.

• Designed and implemented Slowly Changing Dimension (SCD) methodologies.

• Designed QualityStage jobs to perform data cleansing using the Investigate, Standardize, Match Frequency, Survive, and Reference Match stages.

• Used technical transformation document to design and build the extraction, transformation, and loading (ETL) modules.

• Imported various Application Sources (Database tables like Oracle, DB2, flat files) into Ascential Datastage Manager.

• Extracted the data from Web applications using Web Services pack in datastage.

• Created Datastage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, Etc.

• Used version control for DataStage to track changes made to project components and to protect jobs by making them read-only.

• Wrote SQL and PL/SQL procedures to improve performance while loading.

• Developed Server jobs using stages ODBC, Link Partitioner, Aggregator, Transformer, Link Collector, and Hash File etc.

• Migrated Jobs from Development to QA and to Production Environments.

• Wrote and executed various UNIX Korn shell scripts before scheduling jobs.

• Created various types of reports like Master Detail, Cross Tab, Drill Down and Linked reports to enable easy analysis.

• Created complex reports using user-defined functions such as @prompt, variables, and conditions.

• Documented self-developed reports and universes, and supported existing universes and reports.

• Used Perl scripts in developing jobs to improve job performance.

• Scheduled jobs using crontab.

• Analyzed the performance of jobs and the project, and enhanced performance using standard techniques.

• Used DataStage Parallel Extender parallel jobs to improve job performance.

• Extensively used Parallel Stages like Row Generator, Column Generator, Head, and Peek for development and de-bugging purposes.

Environment: IBM WebSphere Information Server 8.1/7.5.2 (Designer, Manager, Administrator, Director, QualityStage), SQL, Oracle 10g/9i, MS Visio, Erwin 4.1, Cognos, Windows NT, UNIX (AIX 4.3), crontab, UNIX shell scripting.
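The pre-scheduling Korn shell scripts mentioned in the responsibilities above can be sketched as a source-file check run before the job is released. This is a hypothetical illustration; the file path and column names are invented, and the real check would precede a DataStage job invocation.

```shell
#!/bin/sh
# Hypothetical pre-run check of the kind a Korn-shell wrapper might
# perform before a scheduled DataStage job: verify that the source
# extract exists and is non-empty before the load starts.
# The file path below is illustrative, not taken from the project.

SRC="${SRC:-/tmp/extract_demo.csv}"

precheck() {
    if [ ! -s "$SRC" ]; then
        echo "ERROR: missing or empty source file: $SRC" >&2
        return 1
    fi
    return 0
}

# Demonstration: create a small extract so the check passes.
echo "cust_id,amount" > "$SRC"
precheck && echo "precheck passed for $SRC"
```

Failing fast here keeps an empty or missing extract from silently loading zero rows into the warehouse.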

Masco Contractor Services, Daytona Beach, FL Apr 2008-Jun 2009.

Datastage Developer

Masco Contractor Services, a group of independent companies that are subsidiaries of Masco Corporation, is one of the world's leading manufacturers of home improvement and building products and a leading provider of services that include installation of insulation and other building products. It has more than 200 divisions and 52 distribution sites in the United States. As part of its Vision to Reality project, Masco is developing a centralized system for handling order management and billing in Oracle Applications.

Responsibilities

• Involved in implementing Star/Snow Flake schemas for the data warehouse using ERWin for Logical / Physical data modeling and Dimensional Data Modeling.

• Prepared data mapping documents and designed the ETL jobs based on the DMD with the required tables in the Dev environment.

• Involved in the identification and analysis of the source data for performing the ETL operations

• Interacted with end users in finalizing the requirements and documented the Program Specifications for the ETL jobs

• Developed various business processes and Context Diagrams to find new ways of doing certain tasks, which resulted in efficient processes, cost and time savings. Develop Proof of concept for model ideas

• Developed DataStage parallel jobs in which data from different sources was formatted, cleansed, summarized, aggregated, and transformed into the data warehouse using the required stages.

• Designed several parallel jobs using Sequential File, Dataset, Join, Merge, Lookup, Change Apply, Change Capture, Remove duplicates, Funnel, Filter, Copy, Column Generator, Peek, Modify, Compare, Oracle Enterprise, Surrogate Key, Aggregator, Transformer, Decode, Row Generator stages

• Analyzed the performance of the jobs and project and enhance the performance using standard techniques

• Developed jobs to extract data from web applications using Web Services packs.

• Created Master Job Sequencers to control sequence of Jobs using job controls.

• Extensively worked with Job sequences using Job Activity, Email Notification, Sequencer, Wait for File activities to control and execute the Data stage Parallel jobs.

• Created materialized views, Analyzed Query Plans, Partitioning and Indexing to increase the speed of queries.

• Created PL/SQL Procedures, Functions and triggers on Database tables before loading to check some validations.

• Migrated jobs from development to QA to Production environments.

• Defined UNIX -shell scripts for file watcher and file archiving process.

• Generated pre-production reports, based on the data mart, with the help of the reporting team using Business Objects.

• Developed complex queries using different data providers in the same report

• Published reports to users to their e-mail addresses by using Broadcast Agent Publisher.

• Worked with the Supervisor module to create users and user groups for different areas and to set their privileges.

• Used shared containers for server jobs and shell scripts for job sequences: handling rejected data, handling NULL values, and complete email reporting of data changes for production support.

• Used Teradata bulk load stages to load data into the Teradata database.

• Extensively developed Data stage server routines using Data stage Basic Language as part of the development process.

• Automated scheduled jobs using the crontab scheduling tool.

Environment: IBM WebSphere Information Server 8.1/7.5.2 (Designer, Manager, Administrator, Director), SQL, DB2/UDB 8.1.5, Erwin 4.1, Windows NT, UNIX (AIX 4.3), Oracle 10g/9i, Teradata, UNIX shell scripting.
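The file-watcher and file-archiving shell scripts mentioned in the responsibilities above follow a common pattern: poll a landing directory for an expected feed file, then move it to an archive directory with a datestamp. The sketch below is illustrative; the directory names, feed file name, and retry limit are all invented for the example.

```shell
#!/bin/sh
# Minimal sketch of a file-watcher / file-archiving process: poll a
# landing directory for an expected feed file, then archive it with a
# datestamp suffix. All names below are hypothetical.

LANDING="${LANDING:-/tmp/landing_demo}"
ARCHIVE="${ARCHIVE:-/tmp/archive_demo}"
FEED="orders.dat"
MAX_TRIES=3

mkdir -p "$LANDING" "$ARCHIVE"

watch_and_archive() {
    tries=0
    until [ -f "$LANDING/$FEED" ]; do
        tries=$((tries + 1))
        if [ "$tries" -ge "$MAX_TRIES" ]; then
            echo "ERROR: $FEED never arrived in $LANDING" >&2
            return 1
        fi
        sleep 1
    done
    # Archive with a datestamp so reruns never overwrite prior feeds.
    mv "$LANDING/$FEED" "$ARCHIVE/$FEED.$(date +%Y%m%d)"
}

# Demonstration: drop the feed file, then watch for and archive it.
echo "order data" > "$LANDING/$FEED"
watch_and_archive && echo "archived $FEED"
```

In practice such a watcher would run from cron ahead of the ETL load, so the job only starts once the feed has landed.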

Broadridge, Jersey City, NJ Feb 2007-Mar 2008.

ETL Developer

Broadridge Financial Solutions, Inc. provides technology-based solutions to the financial services industry in the United States, Canada, and the United Kingdom. The company operates through three segments: Investor Communication Solutions, Securities Processing Solutions, and Clearing and Outsourcing Solutions.

Responsibilities:

• Implemented the data warehouse using sequential files from various source systems.

• Met with source system users and business users to create data share agreements and gather BI requirements.

• Developed Mapping for Data Warehouse and Data Mart objects.

• Used DataStage Manager for importing metadata from repository, new job categories and creating new data elements.

• Involved in Designing Parallel Extender Jobs.

• Worked extensively with parallel stages such as Copy, Join, Merge, Lookup, Row Generator, Column Generator, Modify, Funnel, Filter, Switch, Aggregator, Remove Duplicates, and Transformer.

• Designed and developed ETL jobs using DataStage to load the data warehouse and data mart.

• Performed performance tuning of ETL jobs.

• Performed data manipulation using BASIC functions and DataStage transforms.

• Defined reference lookups and aggregations.

• Imported relational metadata information for the project.

• Defined constraints and derivations.

• Created master controlling sequencer jobs using the DataStage Job Sequencer.

• Created and used DataStage shared containers and local containers for DS jobs and for retrieving error log information.

• Designed, built, and managed complex data integration and load processes.

Environment: DataStage 7.5, Oracle 10g, MS SQL Server 2005, Erwin, Mainframe, Autosys, Business Objects 6.5/XI R2, UNIX AIX, Visio, and Perl.

Maintec Technologies Pvt. Ltd. (India) Nov 2005-Oct 2006.

Responsibilities:

• Installed DataStage (Parallel extender) and configured on AIX.

• Configured the ODBC settings in DataStage

• Mapped Data Items from Source System to the Target System.

• Designed the ETL Jobs based on the requirements and wrote the routines.

• Used DataStage Director and its run-time engine to schedule running the solutions, testing and debugging its components, and monitoring the resulting executable versions.

• Scheduled the ETL Job in the Director for the Daily Process.

• Created IBM DataStage parallel jobs to extract, transform, and load data using the DataStage Designer tool in parallel processing mode.

• Used Parallel Extender for splitting the data into subsets and to load data, utilizing all available processors to achieve job performance.

• Moved the ETL Jobs from Development to Production.

• Maintained and Supported the Production Job daily.

Environment: IBM Data Stage 7.5, Profile Stage, Oracle 9i, SQL, UNIX AIX 5.3, Windows NT.

T&P Technologies (India) Mar 2003-Sep 2005.

Responsibilities:

• Participated in numerous meetings and discussions with users in the department.

• Integrally involved in creating a database to include sequence generators, indexes, and foreign key constraints to implement referential integrity. Stored procedures and database triggers were created using PL/SQL.

• Developed a user-friendly, menu-driven interface with extensive use of Oracle Forms 4.5, implementing master-detail relationships wherever necessary.

• Developed modules including master forms, transactions, query forms, and reports.

• Developed reports for the analysis of daily activities using Reports 2.5.

• Developed interface programs using PL/SQL and SQL*Loader to transfer data back and forth between the Payables and Purchasing systems.

• Prepared test specification documents and wrote test cases.

• Performed Black Box Testing and White Box Testing

• Worked extensively on functional, system, and GUI testing and related methodologies.

• Performed Regression and Integration Testing

• Wrote test scripts, executed test cases, and performed user acceptance testing.

Environment: Oracle 7.x, Windows NT, Designer 2000, Developer/2000, PL/SQL, and SQL.


