Resume


Data Developer

Location:
Columbus, Ohio, United States
Posted:
February 11, 2019


Samrat Chatterjee

Mobile: +1-203-***-****

E-mail: ac8gi7@r.postjobfree.com

http://www.linkedin.com/in/samrat-chatterjee-406570a

PROFESSIONAL SUMMARY

A forward-thinking, capable, and committed senior data warehouse consultant with over 10 years of experience implementing data warehousing, data lake, and Denodo data virtualization applications across multiple tools and backend databases.

Data virtualization experience:

Denodo 5.5 and Denodo 6.0

Involved in designing and creating reports in the Denodo platform.

Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL/Denodo processes.

Hands-on experience creating Denodo views using JDBC, ODBC, delimited files, and SOAP/REST web services.

Creating Denodo base and derived views in VDP by connecting multiple data source systems.

Creating and scheduling Denodo jobs with the web-based Scheduler.

Monitoring overall data quality for assigned data entities and capturing DQ rules for those entities.

Data warehouse experience:

Well versed in data warehousing concepts, with a good understanding of data lake concepts and architecture.

Expertise in data analysis for source and target systems, and in data architecture.

Expertise in designing and implementing ETL environments using tools such as Informatica, with backend databases including Oracle and Greenplum.

Experience using ETL methodologies for data extraction, migration, transformation, and loading with Informatica PowerCenter 10.1/9.6.1/9.1/8.6.1/7.x/6.2.

Experienced in gap analysis between source feeds and master data.

Experienced in DevOps environments, with an understanding of operational challenges.

Strong understanding of data warehouse modeling: star and snowflake schemas.

Preparing logical and physical data models using ER/Studio.

Implemented performance-tuning techniques at application, database and system levels.

Good exposure to UNIX shell scripting in combination with Informatica.

Experience with the full software development process, from requirements through implementation.

Broad IT industry experience in banking and finance, with strong analytical, problem-solving, organizational, communication, learning, and team skills.

Experienced with coordinating cross-functional teams.

Excellent communication and interpersonal skills; able to work effectively as a team lead or as an individual contributor.

Good understanding of collateral management and capital investment domain terminology.

Ability to quickly adapt to changing priorities, requests, and environments.

Adept in UNIX operating system commands & Shell Scripting.

Experience with scheduling tools: Control-M, AppWorx, and the Windows scheduler.

Experience in implementing data virtualization projects

Informatica product experience:

Informatica PowerCenter

Strong development experience in Informatica PowerCenter 7.1.4, 8.1.1, 8.6.1, 9.1.1, 9.5.1, and 10.1.

Extensively worked with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.

Strong experience in creating transformations, mainly Expression, Router, Joiner, Lookup, Update Strategy, and Sequence Generator.

Hands-on experience reading data through XML sources and targets and working with web services; configured Informatica workflows to read from web services.

Experience in implementing Informatica change data capture (CDC).

Involved in the installation and administration of Informatica server.

Used Informatica Metadata Manager and Metadata Exchange exhaustively to maintain and document metadata.

Database experience:

Oracle / Oracle Exadata / Pivotal Greenplum

Strong experience in programming using SQL.

Good understanding of Greenplum databases; capable of writing complex SQL queries and functions.

Extensively involved in creating Oracle PL/SQL stored procedures, functions, packages, triggers, cursors, and indexes, with query optimization, as part of the ETL development process.

Expertise in designing data lake environments using HVR and Pivotal Greenplum.

Experience in loading data from different sources into the Greenplum system.

Implemented parallel data unloading and loading using external tables with the gpfdist utility.

Experience in performance tuning.

Implemented some of the ETL processes using Linux shell scripting and PL/pgSQL.

Good understanding of data distribution, table partitioning and compressions in Greenplum.

Experience in designing data models.

Experience in implementing an enterprise data warehouse using Greenplum as the target system.
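The external-table load pattern mentioned above can be sketched as follows. This is a minimal, runnable illustration that uses SQLite as a stand-in for Greenplum (gpfdist and external tables need a live cluster); the Greenplum-specific DDL appears only in comments, and all table and column names are hypothetical, not taken from the resume.

```python
# Hedged sketch of the staged-load pattern used with Greenplum external tables.
# In Greenplum, the "staging" table would be an EXTERNAL TABLE reading from a
# gpfdist URL, e.g.:
#   CREATE EXTERNAL TABLE stg_trades (trade_id int, amount numeric)
#   LOCATION ('gpfdist://etl-host:8081/trades*.csv') FORMAT 'CSV';
# Here sqlite3 stands in so the flow is runnable anywhere.
import sqlite3

def staged_load(rows):
    """Load rows into a staging table, then INSERT...SELECT into the target,
    filtering bad records on the way. Returns the target row count."""
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    # Stand-in for the external/staging table.
    cur.execute("CREATE TABLE stg_trades (trade_id INTEGER, amount REAL)")
    cur.executemany("INSERT INTO stg_trades VALUES (?, ?)", rows)
    # Target table; Greenplum would add DISTRIBUTED BY (trade_id) and,
    # for large tables, PARTITION BY a date column.
    cur.execute("CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, amount REAL)")
    # The parallelism in Greenplum happens here: every segment pulls from
    # gpfdist concurrently during this INSERT...SELECT; the SQL is the same.
    cur.execute(
        "INSERT INTO trades "
        "SELECT trade_id, amount FROM stg_trades WHERE amount IS NOT NULL"
    )
    (n,) = cur.execute("SELECT COUNT(*) FROM trades").fetchone()
    conn.close()
    return n
```

The same staging-then-select shape applies whether the staging side is a flat file, an external table, or a foreign source; only the staging DDL changes.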

Education

Degree: B.Tech in IT
University: West Bengal University of Technology
Year of Passing: 06/2006

Experience

Organization | Designation | Duration
Collabera | Denodo Lead | (04/2018) – till date
Tech Mahindra Americas Inc. (USA) | Project Lead | (11/2017) – (04/2018)
Tech Mahindra | Team Lead | (08/2014) – (10/2017)
US Technology International Pvt. Ltd. | Sr. Systems Analyst | (11/2013) – (08/2014)
Tejosma Technologies Pvt. Ltd. | Sr. MTS - Engineer | (09/2008) – (10/2013)
Indian Institute of Science - Bangalore | Systems Engineer | (04/2007) – (08/2008)

Technical Skills

Operating Systems: UNIX (RHEL, Sun Solaris, Fedora), Windows
Languages: SQL, PL/SQL, UNIX shell scripting, XML, web services
Databases: Greenplum, Oracle 11g/10g/9i, PostgreSQL, Teradata, MySQL, MS SQL Server
GUI: Informatica PowerCenter 10.1/9.6, Talend
Web Related: Informatica Metadata Manager
Data Virtualization: Denodo 5.5 and 6.0, Denodo administration
Data Modeling Tools: Erwin, ER/Studio 9.0
Tools & Utilities: SQL Developer, PL/SQL Developer, Toad, TextPad, Navicat, WinSCP, PuTTY
Domain Knowledge: Finance domain (banking & finance), shipping
Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Talend, Spark

Projects Profile

Project 6

Project Name: Promise 2020 - Transform
Client: Nationwide, Columbus, Ohio, USA
Role: Lead Denodo and ETL Developer
Organization: Collabera, USA
Duration: (04/2018) – till date

Environment/Software:
Tools: Denodo 5.0 and 6.0, Informatica PowerCenter 10.2
Database: Oracle 12c / Exadata, SQL Server, Netezza

Project Description

Nationwide offers products such as life insurance, banking, retirement, and investments to customers across the US. In this project, data is sourced from UD and TPP, which are internal Nationwide applications. Based on business requirements, we build views at the database/Denodo level, integrate them in Denodo, and publish them via REST web services in JSON format to Nationwide consumer applications, which the business team uses for reporting.
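The REST/JSON publishing flow described above can be illustrated with a short consumer-side sketch. The payload shape (a top-level "elements" array of row objects) and the view name are assumptions for illustration only; the actual structure depends on how the Denodo view is published.

```python
# Hedged sketch: consuming JSON from a Denodo RESTful web service.
# The "name"/"elements" payload shape and the view name below are assumed
# for illustration; they are not taken from the project itself.
import json

def rows_from_denodo_json(payload: str):
    """Extract the list of row dicts from a JSON response shaped like the
    output of a published Denodo REST view."""
    doc = json.loads(payload)
    return doc.get("elements", [])

# A hypothetical response body a consumer application might receive.
sample = json.dumps({
    "name": "dv_customer_view",  # hypothetical published view name
    "elements": [
        {"customer_id": 1, "segment": "Retirement"},
        {"customer_id": 2, "segment": "Banking"},
    ],
})
```

In practice the consumer would fetch this payload over HTTP with authentication; only the parsing step is shown so the sketch stays self-contained.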

Contribution

As a Lead Denodo and ETL Developer, I am responsible for:

Implementing data virtualization service solutions for the project and setting up the new environment for the solution.

Creating data source connections for Oracle, Netezza, SQL Server, etc.

Creating Denodo base and derived views in VDP by connecting multiple data source systems.

Publishing Denodo derived views as REST/SOAP APIs.

Experience working with caching approaches and technologies.

Involved in Denodo Query Optimizations and Cache Performance Optimization for the Data Warehouse.

Creating and scheduling Denodo jobs for cache refresh.

Project 5

Project Name: RTS Activities
Client: GE Capital, Norwalk, USA
Role: Lead ETL and Denodo Developer
Organization: Tech Mahindra Americas Inc., Norwalk, USA
Duration: (08/2014) – (04/2018)
Team Size: 10

Environment/Software:
Database: Greenplum, Oracle Exadata, Oracle 11g, MySQL
Tools: Denodo 5.5 and 6.0, Informatica PowerCenter 10.x
O/S: Unix, Windows 10 Enterprise

Project Description

Development of new schemas for new domains in the Treasury Data Lake, a large integrated warehouse hub that holds all GE Capital trading information, financial revenue, and costing data, with around 13 source systems across multiple legal entities.

Contribution

As a Lead Denodo and ETL Developer, I am responsible for:

Full life-cycle software development, including new development, design/architecture, and implementation of data virtualization service solutions.

Involved in creating connections to MySQL, Oracle, Microsoft SQL Server, PostgreSQL, Teradata, and Denodo VDP servers.

Creating Denodo base and derived views in VDP by connecting multiple data source systems.

Experience working with caching approaches and technologies.

Involved in Denodo Query Optimizations and Cache Performance Optimization for the Data Warehouse.

Collaborate with data architects for data model management and version control.

Conduct data model reviews with project team members.

Capture technical metadata through data modeling tools.

Preparing logical and physical data models using Erwin and ER/Studio.

Ensure data warehouse and data mart designs efficiently support BI and end-user needs.

Developed mapping spreadsheets for the ETL team with source-to-target data mappings, including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.

Experience in writing functions in Pivotal Greenplum.

Proven experience in writing complex SQL queries and functions in Oracle and Greenplum.

Tuning the performance of mappings and sessions in Informatica and at the database level.

Performed extensive testing and wrote SQL queries for pushdown optimization.

Responsible for change management and ensuring the quality of the deliverables.

Participating in UAT meetings and coordinating with the business to receive sign-off.

Preparing improvement plans and value-adds for BI within the project.

Ability to quickly learn applications and processes with minimal documentation.

Leading the technical team and guiding the team on technical challenges.
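As an illustration of the kind of complex SQL referred to above, here is a small window-function query of the sort used for per-entity reporting. It runs against SQLite so the sketch is self-contained; in Oracle or Greenplum the query text would be essentially the same. All table and column names are hypothetical.

```python
# Hedged sketch: ranking positions by amount within each legal entity using
# a window function. SQLite (3.25+, bundled with modern Python) supports the
# same RANK() OVER (...) syntax as Oracle and Greenplum.
import sqlite3

QUERY = """
SELECT entity,
       position_id,
       amount,
       RANK() OVER (PARTITION BY entity ORDER BY amount DESC) AS amt_rank
FROM positions
ORDER BY entity, amt_rank
"""

def ranked_positions():
    """Build a tiny positions table and return rows ranked per entity."""
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE positions (entity TEXT, position_id INTEGER, amount REAL)")
    cur.executemany(
        "INSERT INTO positions VALUES (?, ?, ?)",
        [("LE1", 101, 500.0), ("LE1", 102, 900.0), ("LE2", 201, 300.0)],
    )
    rows = cur.execute(QUERY).fetchall()
    conn.close()
    return rows
```

The PARTITION BY clause restarts the ranking for each entity, which is what makes this pattern useful for "top N per legal entity" style reports.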

Project 4

Project Name: Maersk Line - Uptake Management (UTM)
Client: Maersk Line, Denmark
Role: Team Lead
Organization: US Technology International Pvt. Ltd, India
Duration: (11/2013) – (08/2014)
Team Size: 8

Environment/Software:
Database: Teradata 13
Tools: Informatica PowerCenter 9.5.1
O/S: Windows 7 Enterprise

Project Description

UTM Capacity Sheet: The Capacity Sheet application gives a String Owner or Capacity Manager a view of capacity at ports and on the vessels serving them. By reviewing port and vessel capacity, Capacity Managers can make decisions to avoid loading and discharging issues before confronting them.

As a result, this application helps minimize issues at ports, surfaces more opportunities, and thus optimizes shipments at each port.

Contribution

As a team member, I was responsible for:

Coding using Informatica PowerCenter 9.5.1

Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Project 3

Project Name: Jaamoon Blink
Client: MF Global Holdings Ltd., New York City, USA
Role: Sr. Systems Analyst / Informatica Developer
Organization: Tejosma Technologies Pvt. Ltd.
Duration: (02/2011) – (10/2013)
Team Size: 8

Environment/Software:
Database: Oracle 10g, SQL Server 2008, MySQL Server 5.5
Tools: Informatica PowerCenter 9.1
O/S: Windows 7 Enterprise

Project Description

Jaamoon Blink is an investor CRM product that provides intelligent and informative reports to financial institutions. It allows research teams to send calls/opportunities and track their business and call performance, helping their clients make informed decisions and track market opportunities. Users can inform their clients about these opportunities through multiple channels such as SMS, email, and chat, or via third-party APIs such as Omnesys and Ebuzz.

Contribution

As a team member, I was responsible for:

Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected and unconnected look up, expression, aggregator, joiner, filter, normalizer, rank and router transformations.

Worked with Informatica PowerCenter tools such as Source Analyzer, Mapping Designer, mapplets, and transformations.

Developed Informatica mappings and tuned them for better performance.

Extensively used Informatica to load data from flat files and from Oracle and MySQL databases.

Responsible for Performance Tuning at the Mapping Level and Session level.

Worked with SQL Override in the Source Qualifier and Lookup transformation.

Extensively worked with both Connected and Unconnected Lookup Transformations.

Project 2

Project Name: Broker Trading Portal (Brokerserver)
Client: Axis Securities, SBI Capital, JM Financial, Canara Bank Securities Ltd
Role: Informatica Developer
Organization: Tejosma Technologies Pvt. Ltd.
Duration: (03/2009) – (01/2011)
Team Size: 8

Environment/Software:
Database: Oracle 10g, SQL Server 2008, MySQL Server 5.5
Tools: Informatica PowerCenter 9.1
O/S: Windows 7 Enterprise

Project Description

This project deals with the creation of eCommerce services for different brokers on top of the data warehouse. The objective of the Financial Data Warehouse is to provide a financial reporting infrastructure that centralizes access to information, data, applications, and reporting and analytical systems for the required financial content across different products and services.

Contribution

As a team member, I was responsible for:

Interacted with the Business Users to analyze the Business Requirements and transform the business requirements into the technical requirements.

Extracted data from various heterogeneous sources such as Oracle, MySQL, and flat files.

Worked on Designer Tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.

Analyzed business process workflows and developed ETL procedures to move data from various source systems to target systems.

Created PL/SQL procedures to transform data from staging to Data Warehouse Fact and summary tables.

Project 1

Project Name: Reentry Trajectory Optimization with Path-Constrained High-Index DAEs and Application to RLVs
Client: DRDO and Naval Research Board
Role: System Engineer
Organization: Tejosma Technologies Pvt. Ltd.
Duration: (04/2007) – (08/2008)
Team Size: 4

Environment/Software:
Tools: Java, J2EE, Shell Scripting
O/S: RHEL 5, Solaris 10

Project Description

The project aims at developing efficient computational tools, based on direct dynamic optimization, for trajectory design with low energy-budget objectives. The differential-algebraic equation (DAE) system arising from the reentry phase of an RLV is of high index, and in general there is no efficient method for solving high-index DAEs. Here we used a direct method to solve high-index DAEs as well as optimal control problems.

Contribution

As a team member, I was responsible for developing and coding the algorithm that was designed and provided.


