
Data Developer

Location:
Montreal, QC, Canada
Posted:
October 12, 2016

Professional Summary:

An enthusiastic IT professional with 5+ years of experience in the design, data analysis, development, testing and configuration management of data warehouse projects.

Domain expertise in Banking and Insurance.

Highly experienced in all aspects of SDLC such as requirement analysis, design, development, testing, implementation, deployment & maintenance of projects. Capable of handling responsibilities independently.

Excellent at working in cross-functional development environments with Agile development practices.

Extensively worked on the design and implementation of dimensional data warehouses with star schemas, using the extraction methodologies that best suit the business need; implemented transformations to cleanse the data and load it into the target data marts.

Excellent technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

Worked on designing the ETLs to move the transactional data to the warehouses nightly and in near real time

Experienced in working with Type 1 and Type 2 slowly changing dimensions and fact tables.
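
The Type 2 pattern mentioned above (keeping full history by expiring the old row and inserting a new version) can be sketched in Oracle-style SQL; the table and column names here are illustrative only, not taken from any project named in this resume:

```sql
-- Hypothetical DIM_CUSTOMER / STG_CUSTOMER tables, tracking ADDRESS changes.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
   SET d.effective_end_date = SYSDATE,
       d.current_flag       = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address    <> d.address);

-- Step 2: insert the new version for changed customers (and brand-new ones),
-- i.e. every staged customer with no remaining current row.
INSERT INTO dim_customer
       (customer_key, customer_id, address,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

A Type 1 dimension would instead be a plain UPDATE of the attribute in place, with no end-date or current-flag columns.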

Expertise in working with different sources including flat files (delimited and fixed width), RDBMS tables, csv files, COBOL files and XML files.

Excellent knowledge in RDBMS concepts and constructs along with Database Objects creation such as Tables, User Defined Data Types, Indexes, Stored Procedures, Package, Views (logical/physical), User Defined Functions, Cursors, Collections and Triggers

Designed and developed reusable transformations, Mapplets, Worklets, dynamic mapping parameters/variables and workflow variable assignments from one to another at runtime.

Experience in optimizing long-running queries and stored procedures by eliminating cross joins, introducing indexes, partitioning tables, applying parallel query hints and rewriting queries to use cached result sets.
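
A minimal sketch of the kind of rewrite described above, using hypothetical ORDERS/CUSTOMERS tables (Oracle syntax; the index name and hint degrees are illustrative):

```sql
-- Before: a missing join predicate forces a Cartesian (cross) join.
SELECT o.order_id, c.customer_name
  FROM orders o, customers c;

-- After: explicit join predicate, a supporting index, and parallel /
-- result-cache hints for the long-running case.
CREATE INDEX idx_orders_cust ON orders (customer_id);

SELECT /*+ PARALLEL(o, 4) RESULT_CACHE */
       o.order_id, c.customer_name
  FROM orders o
  JOIN customers c ON c.customer_id = o.customer_id;
```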

Good knowledge of analytic functions (LEAD, LAG, DENSE_RANK, ROW_NUMBER with the OVER clause, etc.) to aid in statistical reporting.
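
A short example of the analytic functions listed above, over a hypothetical MONTHLY_BALANCES table (names are illustrative):

```sql
-- Month-over-month balance change per branch, an overall rank of rows by
-- balance, and a recency number within each branch.
SELECT branch_id,
       month_end,
       balance,
       balance - LAG(balance) OVER (PARTITION BY branch_id
                                    ORDER BY month_end)    AS mom_change,
       DENSE_RANK() OVER (ORDER BY balance DESC)           AS balance_rank,
       ROW_NUMBER() OVER (PARTITION BY branch_id
                          ORDER BY month_end DESC)         AS recency
  FROM monthly_balances;
```

Because these are computed per row without collapsing the result set, they avoid the self-joins that GROUP BY-based equivalents would need.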

Proficient with UNIX commands used in ETL environments and with shell scripts that use Informatica command-line utilities (pmcmd) to run workflows, including looped workflows that run in near real time.

Designed, coded and implemented a Single Customer View solution to create the best-of-breed record within each match set.

Designed, coded and implemented data masking processes for different data sources.

Developed UNIX shell scripts to package DataStage job deliverables and provided support.

Worked on end-to-end performance testing for specific releases.

Troubleshot aborts and issues with DataStage jobs and UNIX scripts.

Created and modified Batch/DRF files as per requirements; scheduled and monitored jobs in Autosys.

Communicated directly with clients/users on a day-to-day basis.

Created code for validation/business rules and reports.

Developed error-report generation and post-processing code.

Technical Skills:

Programming Languages

PL/SQL, T-SQL, Unix Shell Scripting, HTML, C, C++, Core JAVA

Operating Systems

Windows, UNIX

Databases

Oracle 11g/10g, SQL Server 2008, DB2 UDB 9, Netezza

Modeling

Dimensional Modeling using Star and Snowflake Schema, ERWIN, MS Visio

ETL Tools

Informatica Power Center 9.x/8.x, SQL*Loader, DataStage/QualityStage

Other Tools and Utilities

Eclipse Java Development Tools (JDT), SharePoint, Toad, SQL Developer, SQL Developer Data Modeler, Autosys, basic OBIEE reporting, reporting in Excel

Domain Knowledge

Banking, Health Insurance and Retail

Sr. Informatica Developer

BNP Paribas, Montreal June 2016 to Present

Description: As a member of the ETL team, I worked on multiple projects at BNP Paribas, such as the IHC and OFSAA projects.

Responsibilities:

Worked with stakeholders on business requirements, functional specifications and enhancements; based on the business needs, created technical design and functional specification documents.

Assisted the data modeler in developing and designing logical and physical data models that capture current-state and future-state data elements, data flows and workflows, using Erwin and MS Visio.

Developed complex mappings in Informatica using Power Center transformations (Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Filter, Router, Expression, Union, Normalizer and Sequence Generator), mapping parameters, mapping variables, Mapplets and parameter files.

Scheduled the Workflows to run on a daily and weekly basis using Autosys Scheduling tool.

Maintained workflow logs, target databases and code backups, and performed operational support and maintenance of ETL bug fixes and defects.

Assigned tickets to Informatica production support based on the error, and worked on resolving the issues raised in each ticket.

Supported migration of code between production and testing environments and maintained code backups.

Tuned Informatica mappings using dynamic lookup caches, round-robin partitioning and PL/SQL scripts.

Actively coordinated with testing team in writing test cases and performing them and helped the team to understand the dependency chain of the whole project.

Performed data analysis and presented proposals for issue categorizations to stakeholders.

Provided Knowledge Transfer and created extensive documentation on the design, development, and implementation of daily loads and workflows for the designed mappings.

Environment: Informatica Power Center 9.5/9.1, Oracle 11g, SQL Server 2005, flat files, SQL Developer, DB2, Teradata V2R5, Erwin R7, SQL, PL/SQL, Shell Scripting, Autosys and OBIEE.

Sr. Informatica Developer

Vancity bank, Vancouver August 2015 to May 2016

Responsibilities:

Gathered requirements from Business Analysts/System Analysts, managed discussions and documented them.

Developed mappings in Informatica Power Center to load the data from various sources using transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router etc.

Created parameter files in Informatica Power Center and passed them to Power Center Tasks.

Tuned Informatica Power Center mappings for better Performance.

Responsible for identifying reusable logic to build several Mapplets which would be used in several mappings.

Created mappings to extract and de-normalize (flatten) data from XML files using multiple joiners with Informatica Power Center.

Created the transformation routines to transform and load the data.

Wrote Oracle PL/SQL procedures to process business logic in the database, and tuned database queries for better performance.

Wrote UNIX Korn shell scripts and used Autosys for scheduling the sessions and workflows.

Monitored data quality, generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL scripts.

Developed Test Plans and written Test Cases to cover overall quality assurance testing.

Environment: Informatica Power Center 9.1, TOAD for Oracle, Oracle 11g, SQL Developer, SQL Server 2005/2008, Erwin 4.0, DB2, PL/SQL, Power Builder, Sun Solaris 8.0 and Shell Scripting.

DataStage/ Informatica Developer

SAQ, Montreal Dec 2012 to July 2014

Responsibilities:

Develop DataStage/QualityStage jobs and job sequencers as per requirement to integrate source data into MDM.

Use QualityStage to enhance data quality, including standardization and matching solutions for different departments/business purposes (e.g., customer matching for the marketing or AML team).

Design, code and implement a Single Customer View solution to create the best-of-breed record within each match set.

Design, code and implement data masking processes for different data sources.

Develop UNIX shell scripts to package DataStage job deliverables and provide support.

Perform column analysis profiling and business rule analysis in Information Analyzer and DQA, support the dev team in understanding the data, and help with the data mapping document and code table.

Perform ongoing production data quality investigation, compare match results between MDM and the mainframe CRM, and propose enhancements.

Evaluate and select new tools as part of the solution, create proofs of technology (PoT) and showcase them to the Information Security Officer to reach a common understanding of the data masking process, which is designed and built using the IBM DS/Optim Data Masking pack.

Cooperate with different production teams in building new data flow process.

Environment: DataStage/QualityStage 8.1 and 8.5, Informatica 9.1, DS/Information Analyzer, DS/Optim Data Masking Pack, UNIX scripting, IBM InfoSphere, DB2, Oracle 11g, Netezza, MDM, Mainframe z/OS, DB2 z/OS, CICS, COBOL, JCL and MVS Utilities.

DataStage/Informatica Developer

Strathbridge, Toronto, ON Sep 2011 to Nov 2012

Responsibilities:

Evaluated source systems, standardized received data formats, and analyzed business/data transformation rules, business structure, hierarchy and relationships; performed data transformation through mapping development, validation and testing of mappings.

Developed Technical Design Documents.

Monitored Informatica jobs/processes using Workflow Monitor.

Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target based commit interval.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Implemented slowly changing dimension methodology and slowly growing targets methodology for modifying and updating account information and accessing them.

Worked with Type-I, Type-II and Type-III dimensions and Data warehousing Change Data Capture (CDC).

Wrote Unix scripts to back up the log files in QA and production.

Created Mapplets for logic that was reused across mappings, such as date formatting or data type conversion.

Extensively worked with SQL queries, created stored procedures, packages, triggers, views using PL/SQL Programming.

Involved in optimization and performance tuning of Informatica objects and database objects to achieve better performance.

Offered production support for daily jobs.

Worked extensively in defect remediation and supported the QA testing

Experienced in taking repository backups and restoring them, starting and stopping Informatica services, and working with pmcmd commands.

Environment: Informatica Power Center 8.6/9.1, DataStage/QualityStage, Oracle 10g, SQL Server, PL/SQL, Windows XP, ERWIN, Business Objects and UNIX.

Informatica/SQL Developer

InfoTech, Hyderabad, India November 2009 to July 2011

Responsibilities:

Involved in the design, development and maintenance of procedures for moving data from all systems into the data warehouse; data was standardized and stored in tables by business unit.

Translated high-level design specifications into simple ETL coding and mapping standards.

Used Power Center for Extraction, Transformation and Loading data from heterogeneous source systems into the target database.

Used stored procedure, views and functions for faster processing of bulk volume of source data.

Develop ETL processes to replicate data from multiple platforms into reporting databases.

Responsible for unit testing and Integration testing.

Assisted in mentoring internal staff on Informatica best practices and skills.

Responsible for multiple projects with cross functional teams and business processes.

Responsible for the development and support of ETL routines; designed the Informatica loads required to populate the data warehouse, loaded high-volume data, tuned and troubleshot mappings, and created documentation to support the application.

Developed ETL process for the integrated data repository from external sources.

Created Functional Specifications for the different Problem Logs to get the approval of the work.

Environment: Erwin 4.0, Informatica Power Center 8.6, PL/SQL, SQL Server 2000, Autosys, HP-UX, AIX 4.3.3, Shell Scripting.

Education

Bachelor's in Information Technology, JNTU, Hyderabad, Andhra Pradesh


