Reasat Ch
Greater New York City Area
732-***-**** / ***.******@*****.***
Professional Summary:
IBM WebSphere Certified Expert Solution Developer with over 10 years of IT experience, with special emphasis on system analysis, design, development and implementation of ETL methodologies across all phases of the data warehousing life cycle and relational databases, using IBM DataStage and QualityStage 11.3/8.5/8.1/7.x (Information Server, WebSphere, Ascential DataStage), Oracle PL/SQL, Scribe Insight, Microsoft SQL Server Integration Services (SSIS), MS SQL Server and DB2.
Expertise in Education, Health Care, Banking, Finance and Telecom domains.
Strong Data Warehousing experience using IBM DataStage components such as DataStage and QualityStage Designer, DataStage Director, DataStage Administrator and Parallel Extender, along with ERwin, MS Visio, SQL, PL/SQL and T-SQL; involved in all stages of the Software Development Life Cycle (SDLC).
Experience with Data Flow Diagrams, Data Dictionaries, database normalization techniques, Entity-Relationship modeling and design techniques.
Experience in Oracle SQL and PL/SQL including all database objects: Stored Procedures, Dynamic Queries, Stored Functions, Packages, TYPE Objects, Triggers, Cursors, REF Cursors, Parameterized Cursors, Views, Materialized Views, PL/SQL Collections.
Professional hands-on experience in independent system analysis, performance analysis, design, coding, development and enhancement in the fields of Data Warehousing, Data Integration, Data Migration and client-server application development using IBM InfoSphere DataStage/QualityStage (8.x/7.x), Scribe, Oracle, PL/SQL, T-SQL, DB2 UDB, .NET, XML, Java, Bash/Shell scripts, PowerShell scripts and Microsoft Dynamics CRM, across Development, Integration Testing, Quality Assurance (QA) and Production environments with no direct supervision.
Data Integration and manipulation for Microsoft Dynamics CRM using the Scribe Insight Console and Workbench.
Thorough understanding of data warehousing principles (Fact Tables, Dimension Tables, Dimensional Data Modeling - Star Schema and Snowflake Schema) and techniques for Data Cleansing, CDC (Change Data Capture), Surrogate Key assignment, Slowly Changing Dimensions (SCD), Staging, ODS, Data Marts and large-scale integration.
Good understanding of IBM QualityStage to cleanse, investigate and manage data for data warehousing, application migration, business intelligence and master data management projects.
Used the DataStage Administrator client for setting up properties, permissions, environment variables and project roles for DataStage suite users.
Excellent understanding of IBM Information Analyzer for importing metadata into the repository and for data profiling.
Excellent understanding of DBA functions including creating tables, tablespaces, databases and indexes, and experience in UNIX shell scripting, PowerShell and Windows batch (CMD) scripting.
Good understanding of Dynamic SQL, RDBMS, Hierarchical data models and Object-Oriented technologies.
Exposure to and knowledge of designing and maintaining logical and physical Data Models, Data Marts and Operational Data Stores (ODS) in Relational Database Management Systems (RDBMS) such as Oracle and MS SQL Server.
Experience in extracting data from sources like SFDC, SAP, Oracle, DB2, SQL Server, Mainframe flat files, XML files and loading to target tables using IBM DataStage Tool.
Developed Audit Controls and Reports using DataStage routines, shell scripts and stored procedures to record data at several different stages of ETL processing, such as source systems, staging, data warehouse and data mart.
Involved in Performance Tuning of ETL codes and Complex SQL Queries. Tuned DataStage jobs and extract SQL queries in Oracle and SQL server.
Experience in supervising and leading development teams for ETL code development, Database and Data Warehouse Design, Unit testing and Documentation across onsite and offshore Application development.
Coordinated with cross-functional teams such as Solution Architecture, Business Analysis, QA Testing, Performance Tuning and Operations to ensure the timely and successful delivery of projects with little or no direct supervision.
Experience working with the Agile development methodology in all phases of enterprise application development, and critical production support of complex ETL processes on a roster basis.
Expertise in creating reusable components such as Shared and Local Containers, and in creating custom transformations, functions and routines.
Knowledge in Designing the Information Server components and Data Cleansing for IBM MDM integration.
Knowledge of IBM PureData System for Analytics, which provides a single point of support for end-to-end analytics solutions.
Excellent track record as a strong team player with effective communication, analytical and multitasking skills; resourceful, results-driven and self-motivated.
Skills/Tools:
IBM InfoSphere 11.3/8.5/8.1 (DataStage and QualityStage, Information Analyzer, Metadata Workbench), Ascential DataStage Enterprise Edition 7.5/7, Microsoft SQL Server Integration Services (SSIS), Oracle 11g/10g/9i/8i, T-SQL, TOAD, SQL*Loader, SQL Server 2005/2008, IBM MDM, IBM PureData System for Analytics, MS Excel BI, MS Office Suite, Windows XP/NT/2000/98, C, C++, C#, Teradata V2R5.x/V2R6.x, MKS, Autosys, UML, MS Visio, ERwin 7.2, IBM AIX, HP-UX, Sun Solaris, Cognos 8.0, JIRA, Control-M, Business Objects, Python, SAP, Java, .NET, Scribe Insight 7.6.2/7.6/7.9/7.9.2, Scribe Console, Scribe Workbench, MSSQL Migration Tool for Oracle, Microsoft Dynamics CRM 2013/2015/2016.
Professional Experience:
Sep 2014 – Present (27 months)
NYC Department of Education, Brooklyn, NY
Programmer Analyst III
The New York City Department of Education operates the public schools in New York City and is the largest school system in the United States, serving 1.1 million students in over 1,800 schools. The Division of Instructional and Information Technology (DIIT) supports world-class teaching and learning in New York City schools by providing information and technology solutions. The goal of the Data Core team is to deliver data solutions for this large and complex system, covering students, teachers, employees, courses and schools.
Responsibilities:
Served as the primary on-site ETL Developer during the analysis, planning, design, development and implementation stages of projects using IBM DataStage components, and facilitated the implementation and maintenance of the Data Core data warehouse architecture.
Independent programming experience with Business Intelligence (BI) and data modeling tools for complex data integration, ETL (Extract, Transform and Load) and the implementation of functional artifacts.
Design, development and implementation of an Oracle-to-SQL Server migration for the real-time Student Attendance system.
Worked with data flow diagrams, data dictionaries, database normalization techniques, entity-relationship modeling and design techniques. Extensive use of Oracle pseudo columns such as ROWID, ROWNUM, SYSDATE, SYSTIMESTAMP and USER (illustrated in the sketch at the end of this section).
Wrote UNIX shell scripts to process files on a daily basis by renaming, extracting, unzipping and trimming unwanted characters before loading into the base tables.
Created DataStage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Filter, Change Data Capture, Surrogate Key, Column Generator and Row Generator, as well as legacy system data structures.
Extensively used parallel stages like Row Generator, Column Generator, Head and Peek for development and debugging purposes.
Gathered and designed requirements definitions, report designs, conceptual data models, data dictionaries, report mockups and detailed documentation for Business Intelligence (BI) applications and reporting solutions.
Development, implementation, enhancement, maintenance, configuration and integration of important Online Analytical Processing (OLAP), Business Intelligence, CRM and Online Transaction Processing (OLTP) systems running across the organization.
Scripting in PowerShell and shell to ensure safe and secure FTP, LFTP and SFTP connections and the communication of complex encrypted data for commercial off-the-shelf (COTS) applications and systems.
Data and application integration using Scribe Insight for cloud and on-premises applications.
Data replication and migration, connectors and agile customization tools using Scribe Insight for the Microsoft Dynamics CRM application.
Used Scribe Workbench to preserve historical data for the OLAP system.
Created shell scripts to run DataStage jobs from UNIX and scheduled them through a scheduling tool.
Independently documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
Provided production support for enterprise data warehouse and outside vendor applications.
Deployment, design, analysis and implementation of Business Intelligence (BI) applications and reporting solutions to maintain Data Warehousing methodologies such as Star and Snowflake schemas, Dimensional Models and Enterprise Data Marts.
Participated actively in code review meetings with DBAs, managers, data analysts and data architects to understand the complex architecture and identify optimized designs for better performance.
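For illustration, a minimal sketch of the Oracle pseudo-column usage referenced above; the table and column names (stg_attendance, stg_attendance_sample) are hypothetical:

```sql
-- De-duplicate staging rows, keeping one ROWID per natural key.
DELETE FROM stg_attendance s
 WHERE s.ROWID NOT IN (SELECT MIN(t.ROWID)
                         FROM stg_attendance t
                        GROUP BY t.student_id, t.attendance_date);

-- ROWNUM caps a test extract; SYSDATE and USER populate audit columns.
INSERT INTO stg_attendance_sample (student_id, attendance_date, load_date, loaded_by)
SELECT s.student_id, s.attendance_date, SYSDATE, USER
  FROM stg_attendance s
 WHERE ROWNUM <= 1000;
```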
Environment: IBM DataStage and QualityStage 8.5, Oracle 12c/11g, MS SQL Server 2014, WinSCP, Notepad++, SoapUI 4.0.1, IBM Information Server, MS Office Suite, UNIX-AIX, Windows 7, Linux, Cognos 8.0, .NET, Scribe Insight 7.9, Scribe Workbench 7.9, ESB Manager, MSSQL Migration Tool for Oracle, Microsoft Dynamics CRM.
May 2014 – Aug 2014 (4 months)
University of Connecticut, Storrs, Connecticut
Lead Programmer Analyst
The University of Connecticut stands among the top 20 public universities in the nation. The purpose of the project was to develop a data warehouse to facilitate the Kuali Financial System (KFS) and the Kuali Financial Data Mart (KFDM). The mission is to implement, maintain and support the University's core financial systems by providing the highest quality of customer service, leadership, business analysis, financial report development and training to the University community.
Responsibilities:
Defined Data Mapping rules and Data Transformation Rules. Performed all Data Validation and Data Reconciliation tasks as the Lead Developer.
Worked closely with the data modeler and database administrator to understand the business process, and participated in gathering the business requirements to load the data warehouse, which was designed on the Star schema following the Ralph Kimball methodology.
Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.
Encapsulated error handling and used autonomous transactions for logging. Used UTL_MAIL to generate e-mails (see the sketch at the end of this section).
Studied the PL/SQL code developed to relate the source and target mappings.
Helped in preparing the source-to-target mapping document in ERwin.
Designed and developed ETL processes using DataStage Designer Client to load data from Oracle Database to staging database and from staging to the target Data Warehouse database.
Used DataStage stages like Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator in accomplishing the ETL coding.
Developed Slowly Changing Dimension mappings for Type 1 and Type 2 SCDs, as sketched at the end of this section.
Developed job sequencer with proper job dependencies, job control stages, triggers.
Used the DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions.
Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
Configuration, Architecture, Development and Administration of Microsoft Dynamics CRM.
Deployment of the data model and database objects followed the EIM standards for promotion between environments: Development > QA > Pre-Production > Production.
Troubleshot ETL run-time errors reported by the Control-M scheduler at the production level.
Debugged issues tracked in JIRA to ensure a smooth ETL migration process.
Participated in weekly status meetings.
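For illustration, a minimal sketch of the autonomous-transaction logging and Type 2 SCD patterns described above. All table, sequence and column names and the e-mail addresses are hypothetical, and UTL_MAIL assumes SMTP_OUT_SERVER is configured:

```sql
-- Autonomous-transaction logger: its insert commits even when the calling
-- ETL transaction rolls back; the failure is then e-mailed via UTL_MAIL.
CREATE OR REPLACE PROCEDURE log_etl_error (p_job VARCHAR2, p_msg VARCHAR2) IS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO etl_error_log (job_name, error_msg, logged_at)
  VALUES (p_job, p_msg, SYSTIMESTAMP);
  COMMIT;
  UTL_MAIL.SEND(sender     => 'etl@example.edu',       -- hypothetical addresses
                recipients => 'dw-team@example.edu',
                subject    => 'ETL failure: ' || p_job,
                message    => p_msg);
END log_etl_error;
/

-- Type 2 SCD: expire the current row when a tracked attribute changes,
-- then insert a new current row for new and changed business keys.
CREATE OR REPLACE PROCEDURE apply_scd2_account IS
BEGIN
  UPDATE dim_account d
     SET d.effective_to = SYSDATE,
         d.current_flag = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM stg_account s
                  WHERE s.account_nk = d.account_nk
                    AND s.account_name <> d.account_name);

  INSERT INTO dim_account (account_sk, account_nk, account_name,
                           effective_from, effective_to, current_flag)
  SELECT dim_account_seq.NEXTVAL, s.account_nk, s.account_name,
         SYSDATE, DATE '9999-12-31', 'Y'
    FROM stg_account s
   WHERE NOT EXISTS (SELECT 1
                       FROM dim_account d
                      WHERE d.account_nk = s.account_nk
                        AND d.current_flag = 'Y');
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    log_etl_error('APPLY_SCD2_ACCOUNT', SQLERRM);
    RAISE;
END apply_scd2_account;
/
```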
Environment: IBM DataStage and QualityStage 8.5, Information Server, SSH Secure Shell Client, Oracle 11g, Oracle SQL Developer 4.0.1.14, Flat Files, CSV Files, Toad, Control-M, ERwin 7.2, Windows 7, UNIX-AIX, UltraEdit, JIRA, Microsoft Dynamics CRM 2013
Apr 2012 – Apr 2014 (25 months)
CVS Caremark, Woonsocket, Rhode Island
Senior ETL Developer
CVS Caremark is a recognized leader in the health and well-being industry, committed to improving the health care system. The goal of the project was to develop a data warehouse to implement and integrate member eligibility, member information, product sales, inventory and service policy information to support different requirements.
Responsibilities:
Defined Data Mapping rules and Data Transformation Rules. Performed all Data Validation and Data Reconciliation tasks.
Developed several transformations in the process of building Data warehouse database.
Analyzed the source data coming from Oracle, legacy systems, XML, SAP, Excel files, DB2, SQL Server and flat files, and worked with business users and developers to develop the model.
Designed, Developed and managed multiple ETL Routines using DataStage and QualityStage for Claims Processing, Claims Approval and Customer Information.
Used the DataStage and QualityStage Designer to develop processes for extracting, cleansing, transforming and loading data into the Database.
Used different stages of DataStage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sort, Column Generator, Remove Duplicates, Modify, Transformer, Sequential files, DS files, CDC, SCD, Pivot.
Used Sequencer stages like Execute Command, Nested Loop, Email Notification, User Variables Activity, Sequencer, Exception Handler, Terminator Activity, Job Activity and Wait For File Activity.
Developed several complex ETL processes to improve performance by reducing runtime using different partitioning techniques.
Used DataStage Director for validating, executing and monitoring jobs and checking the log files for errors.
Used stage variables for source validations, to capture rejects and used Job Parameters for Automation of jobs.
Created Parameter sets and used them in the jobs and imported them into different environments for later use.
Used Email notification activity in the sequence to notify the results after job execution.
Used Transformer stages with stage variables to map data, create expressions and constraints to match the business transformations.
Data and application integration using Scribe Insight for cloud and on-premises applications.
Data replication and migration, connectors and agile customization tools using Scribe Insight for the cloud platform.
Used User Variables and Loops to transform the data to load it to target Data Warehouse.
Used Autosys as the scheduling tool to kick off the batch files.
Responsible for DataStage Best Practices Implementation, Performance tuning at all levels and Error Handling methods.
Configuration, Architecture, Development and Administration of Microsoft Dynamics CRM.
Implemented various loads such as daily loads, weekly loads and quarterly loads.
Extensively coded PL/SQL and SQL and tuned them for better performance (a tuning sketch follows).
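For illustration, a minimal sketch of one tuning pattern applied to row-by-row PL/SQL: batching reads with BULK COLLECT and writes with FORALL. The table and column names are hypothetical:

```sql
-- Fetch in batches of 5000 and issue one bulk UPDATE per batch instead of
-- one UPDATE per row, cutting context switches between SQL and PL/SQL.
DECLARE
  TYPE t_ids   IS TABLE OF member_eligibility.member_id%TYPE;
  TYPE t_codes IS TABLE OF member_eligibility.plan_code%TYPE;
  l_ids   t_ids;
  l_codes t_codes;
  CURSOR c_src IS
    SELECT member_id, plan_code FROM stg_member_eligibility;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_ids, l_codes LIMIT 5000;
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT
      UPDATE member_eligibility
         SET plan_code = l_codes(i)
       WHERE member_id = l_ids(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```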
Environment: IBM DataStage and QualityStage 8.5/8.1, Information Analyzer, Information Server, Oracle 11g/10g, SQL Server 2008/2005, DB2, XML Files, Flat Files, CSV Files, OBIEE, QlikView 9.0/8.5, Autosys, ERwin 7.2, MS Visio, Windows 2003/2007, UNIX, SAP, Toad, Scribe Insight 7.0.1/7.5.2, Scribe Workbench 7.5.0/7.5.1, Microsoft Dynamics CRM.
Jan 2010 – Mar 2012 (27 months)
Barclaycard US, Wilmington, Delaware
Senior Data Integration Engineer
Barclays is a diversified financial services company with operations around the world. The scope of the project was to design and develop DataStage server/parallel jobs based on requirements, extract source data from existing sub-systems in SQL Server, Oracle, DB2 and flat/CSV files, and load the data into a data warehouse.
Responsibilities:
Analyzed the various sources and designed and developed server/parallel jobs for extracting data from different databases such as SQL Server, IBM DB2 Mainframe, Oracle, flat/CSV files and sequential files.
Interacted with business users, reporting analysts in gathering information about the business logic to populate the target tables.
Extracted data from heterogeneous sources and loaded it into Oracle tables (one common landing pattern is sketched at the end of this section).
Developed user-defined subroutines in DataStage BASIC to implement some of the complex transformations, date conversions, complex ETL code validations and calculations using various DataStage-supplied functions.
Identified source systems, their connectivity, related tables and fields and ensured data consistency for mapping.
Created source table definitions in the DataStage repository by studying the data sources.
Extensively created parameters in job properties for entering default values while loading.
Worked with DataStage client tools: Designer, Director, Manager and Administrator.
Involved in several optimizations on DataStage jobs and improved the performance.
Developed jobs in DataStage using different stages such as Transformer, Aggregator, Lookup, Merge, Join, Remove Duplicates, Funnel, Sort, Change Data Capture, Sequential File, Complex Flat File and ODBC.
Validation, monitoring and scheduling of various DataStage jobs was done using DataStage Director; also viewed and interpreted the performance statistics.
Used Information Analyzer for importing Table Metadata and Data Profiling.
Used DataStage Administrator for setting up properties, permissions and environment variables.
Designed jobs using different Parallel Job Stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Change Data Capture, Slowly Changing Dimensions, Switch, Modify, and Aggregator.
Data structure, data access, data manipulation and application integration using Scribe Insight for cloud and on-premises applications.
Data replication and migration, connectors and agile customization tools using Scribe Insight for the cloud platform.
Extensively used Hash File Stage, Transformer Stage, Aggregator Stage, Link Partitioner and Link Collector Stages in transforming the source data into the target database.
Involved in high performance tuning of both the Server and Parallel jobs. Extensively used DataStage Director's Job monitoring to perform the tuning of the jobs.
Supported business users during User Acceptance Test (UAT), Assembly Testing and Post Implementation Phase.
DataStage best practices were formulated keeping in mind simplicity and higher performance and later incorporated into the jobs developed.
Developed job sequences for the designed jobs using Job Activity, Execute Command, E-mail Notification activities and Triggers.
Designed and modified legacy server-type jobs in IBM DataStage.
Developed Parallel extender jobs for extracting data from the various sources.
Involved in unit testing, system testing, integration, reporting, and performance testing of the jobs.
Produced extensive documentation for the designed jobs/sequencers for future reference.
Supported in the delivery of solution-oriented complex reports using Cognos Report Studio.
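For illustration, one common Oracle pattern for landing the flat/CSV extracts mentioned above: an external table read directly by SQL, followed by a direct-path insert into staging. The table, columns and directory object are hypothetical:

```sql
-- External table over a CSV extract; the file is parsed at query time.
CREATE TABLE ext_card_txn (
  txn_id     NUMBER,
  account_id NUMBER,
  txn_amount NUMBER(12,2),
  txn_date   VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_data_dir          -- hypothetical directory object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('card_txn.csv')
)
REJECT LIMIT UNLIMITED;

-- Direct-path load into staging, converting the date on the way in.
INSERT /*+ APPEND */ INTO stg_card_txn (txn_id, account_id, txn_amount, txn_date)
SELECT txn_id, account_id, txn_amount, TO_DATE(txn_date, 'YYYY-MM-DD')
  FROM ext_card_txn;
COMMIT;
```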
Environment: IBM DataStage 8.0.1, MetaStage, ERwin 7.2, Toad, SQL Server 2005, Oracle 10g, Query Analyzer, Enterprise Manager, PL/SQL, Scribe Insight 6.3.1/6.5.1, Teradata V2R5.x/V2R6.x, SQL*Loader, IBM DB2, Cognos 8.0, Linux, Windows, Control-M.
Dec 2008 – Dec 2009 (13 months)
Wright Express, South Portland, ME
DataStage Developer
Wright Express is a leading provider of payment processing and information management services to the U.S. commercial and government vehicle fleet industry. The data warehousing project involved designing, developing, testing and implementing complex mappings and workflows using DataStage to provide client and network partner details for managing fuel, vehicle maintenance, mileage and position tracking.
Responsibilities:
Analyzed the existing ETL process and came up with an ETL design document that listed the jobs to load, the logic to load and the frequency of load of all the tables.
Imported data from various transactional data sources residing on SQL Server, DB2 and Flat files and loaded into Oracle database.
Developed DataStage jobs based on business requirements using various stages like Lookup File, Lookup Stage, Join Stage, Merge Stage and Sort Stage.
Developed several transformations in the process of building Data warehouse database.
Extensively worked on Data Acquisition and Data Integration of the source data.
Implemented various process checks, data checks and mail notifications to ensure the quality of the data that is loaded into the data warehouse.
Wrote complex stored procedures and views for ETL jobs.
Created indexes and partitions in the database to improve the performance of ETL jobs (see the sketch at the end of this section).
Involved in writing shell scripts for reading parameters from files and invoking DataStage jobs.
Created jobs in Autosys Job Management Scheduler to run the jobs in sequence without manual intervention.
Created statistical reports using Cognos.
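For illustration, a minimal sketch of the indexing and partitioning approach referenced above; the table, column and partition names are hypothetical:

```sql
-- Range-partitioned fact table: loads and purges can work on a single
-- partition (truncate/exchange) instead of scanning the whole table.
CREATE TABLE fact_fuel_txn (
  txn_id   NUMBER       NOT NULL,
  fleet_id NUMBER       NOT NULL,
  txn_date DATE         NOT NULL,
  gallons  NUMBER(10,3),
  amount   NUMBER(12,2)
)
PARTITION BY RANGE (txn_date) (
  PARTITION p2009q1 VALUES LESS THAN (TO_DATE('2009-04-01', 'YYYY-MM-DD')),
  PARTITION p2009q2 VALUES LESS THAN (TO_DATE('2009-07-01', 'YYYY-MM-DD')),
  PARTITION pmax    VALUES LESS THAN (MAXVALUE)
);

-- Local index: one index segment per partition, so partition maintenance
-- does not invalidate the index as a whole.
CREATE INDEX ix_fuel_txn_fleet ON fact_fuel_txn (fleet_id, txn_date) LOCAL;
```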
Environment: IBM DataStage 8.0.1, QualityStage 7.5.2, UNIX, Oracle 10g, SQL Server 2005, DB2, SQL Navigator, Cognos 7.1, XML, SQL, PL/SQL, SQL*Loader, MS Visio.
Sep 2007 – Nov 2008 (15 months)
SingTel, Redwood City, CA
ETL/Data Warehouse Developer
SingTel provides a wide range of communications services. It is Asia's leading communications group, with operations and investments in more than 20 countries and territories around the world. The goal of the project was to design a data warehouse to provide the company with the accurate, on-time information it needed to more effectively support its decision-making processes.
Responsibilities:
Worked with Operations Support staff to evaluate/finalize operations requirements of applications moving into support.
Worked with the project and business teams to understand the business processes involved in the solution.
Involved in the design and development of the data warehouse environment.
Involved in implementing the Star schema for the data warehouse using ERwin (see the sketch at the end of this section).
Extensively used Ascential's DataStage Manager, Designer, Administrator, Director, Parallel Extender and Integrity for creating and implementing jobs.
Created DataStage jobs to extract, transform and load data from various sources such as relational databases, application systems, temp tables, flat files and XML documents into the data warehouse.
Created DataStage jobs to build business rules to load data; most of the jobs used the ODBC, Aggregator and Sequential stages.
Wrote shell and PowerShell scripts for job automation.
Created local and shared containers to facilitate ease and reuse of jobs.
Used the DataStage Administrator to assign privileges to users or user groups (to control which DataStage client applications or jobs they can see or run) and to move, rename or delete projects.
Developed Shell scripts to automate file manipulation and data loading procedures.
Used debugger to test the data flow and fix the mappings.
Helped the reporting team create business intelligence reports.
Made user resources more flexible per business requirements by exporting Universes and BO reports.
Did Unit Testing and Integration testing of the modules.
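For illustration, a minimal star-schema sketch of the kind modeled in ERwin for this project; all table and column names are hypothetical:

```sql
-- Two conformed dimensions and a fact table keyed by their surrogates.
CREATE TABLE dim_date (
  date_sk   NUMBER PRIMARY KEY,
  cal_date  DATE NOT NULL,
  cal_month VARCHAR2(7)
);

CREATE TABLE dim_customer (
  customer_sk NUMBER PRIMARY KEY,
  customer_nk VARCHAR2(30) NOT NULL,
  segment     VARCHAR2(20)
);

CREATE TABLE fact_call_usage (
  date_sk      NUMBER NOT NULL REFERENCES dim_date (date_sk),
  customer_sk  NUMBER NOT NULL REFERENCES dim_customer (customer_sk),
  call_minutes NUMBER(10,2),
  call_charge  NUMBER(12,2)
);
```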
Environment: Ascential DataStage 7.5 (Administrator, Designer, Manager, Director), Parallel Extender, Business Objects 5.1, Oracle 10g, SQL Server 2005, Windows 2000/XP, HP-UX, PL/SQL, ERwin Data Modeler, Shell Scripts
Mar 2006 – Aug 2007 (18 months)
SSD-TECH, Dhaka, Bangladesh
Data Integration Engineer
Systems Solutions & Development Technologies Limited (SSD-TECH) is a renowned software development company operating in Bangladesh and in various countries across the globe. Since its inception in 2004, it has proven its excellence by providing solutions of superior quality to large and small enterprises, banking and non-banking financial institutions, and telecom operators. The goal of the project was to develop a data warehouse for the Fintellegent-Solution for non-banking financial institutions.
Responsibilities:
Communicated with US-based offshore clients for requirements analysis and project management.
Used the DataStage Designer to develop jobs to extract, transform and load data from various source systems into the Data Warehouse.
Created user defined routines and transform functions (in DataStage Manager) to carry out the complex data cleaning and data transformations in the transformer stage.
Extensively worked on Job Sequencers for automation of the various ETL processes.
Participated in reviews of data modeling and business requirement analysis and assisted with defining requirements.
Assisted with production support and fixing the issues encountered in test and production environments.
Verified various data transformations from source to destination.
Verified rejected data from job stages in full detail.
Load-tested Transformer, Lookup, Join and Aggregator stages in DataStage.
Functional testing of complex Transformer jobs.
Wrote and tested complex code in .NET and C for VAS (Value Added Services) applications for offshore telecom companies.
User interface and Usability testing.
Test Case Writing & Test Run Execution.
Technical Documentation of the DataStage jobs.
Tested various reports generated from the data marts.
Environment: Ascential DataStage 7.5, Oracle Server 10g, MySQL, MS SQL Server 2005, HTML, JavaScript, WordPress, JIRA, Mantis, MS Word, MS Excel, Microsoft Visio, Python, C, .NET
Education:
Master of Science in Electrical Engineering and Computer Science
University of Bridgeport
Bachelor of Science in Electronics and Telecommunication Engineering
North South University