Data Project

Location: Plano, Texas, United States
Salary: $130K and negotiable
Posted: October 05, 2016


SUDHIRANJAN SWAIN (Data Architect)

EMAIL: acwxhm@r.postjobfree.com

Visa Status: H1B Holder

Current Location: Plano, Texas
Mobile: +1-214-***-****

Professional Summary:

14+ years of extensive experience in Information Technology with special emphasis on Design and Development of Data Warehouse ETL applications in Banking, Finance, Health Care and Retail domains.

10+ years of experience leading and designing integration services for data warehouse, data mart and operational data store platforms using Informatica PC 9.x/8.6.1/7.2/6.2, ODI 11g & BODS 4.0.

5 years of experience in data modeling: logical/physical modeling, dimensional modeling, star/snowflake schemas, and fact and dimension tables.

5+ years of architectural leadership experience delivering ETL solutions within a large enterprise environment

9+ years of experience with Informatica PowerCenter platforms and their various components.

5+ years of experience working with Oracle- and SQL Server-based RDBMSs using PL/SQL and query development tools.

Around 7 years of experience across all phases of the data warehouse life cycle: requirement gathering, analysis & design, development, and testing.

Hands-on experience developing ETL components/processes using Informatica, ODI and BODS to feed an Operational Data Store and Data Warehouse.

Preparing extensive documents on the ETL Design, Development, Deployment & daily loads of the mappings.

Hands-on experience designing conceptual, logical, physical (ER) and dimensional data models (data warehouse data modeling).

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Exposure to Hadoop ecosystem components such as HDFS and the NoSQL database Cassandra 2.1.

Worked as a Cassandra data architect using KDM, a Cassandra data modeling tool.

Successfully performed the roles of ETL Architect/Data Architect in an onsite-offshore model with a team of 18 members, and managed multiple projects simultaneously.

Primary responsibilities include feasibility studies of business requirements and assisting team members in developing the ETL components and interfaces, checking their dependencies and load factors.

Responsible for defining the complete data warehouse architecture (i.e. ODS, ETL process, Data Marts, EDW) and classifying the key business drivers for the data warehouse initiative.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

End to end experience of three iterations of logical, physical and dimensional data models.

Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server and Oracle PL/SQL.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Facilitates the design and drives the planning for the integration of all data warehouse components, with attention to scalability, performance, and availability.

Facilitates the adoption of the ETL best practices within the organization, including Data integration and evolution of development methodology.

Drives and performs reviews of the design and testing of data models and ETL components (data extracts, transforms and processes) from a performance and error-handling perspective.

Designed the DWH ETL architecture using multiple ETL tools like Informatica, ODI and BODS.

Worked with Teradata utilities (FLOAD, MLOAD, BTEQ and TPUMP) to feed operational data stores and data warehouses.

Solid experience in Ralph Kimball Methodology, Logical/ Physical Modeling, Dimensional modeling, Star Schema, FACT tables and Dimension tables.

Expertise in Logical and Physical Data Model design using various modeling Tools like Erwin 7.3 and SQL Developer.

Involved in development, implementation and support of ETL processes for applications of up to 3 TB.

Managed change control implementations and coordinating weekly, monthly & quarterly release processes.

Setting up Scheduling mechanism and appropriate tool selections based on project requirement functionality.

Trained on the Big Data Hadoop framework (HDFS, Flume, Sqoop, Hive, Pig and Cassandra) for large-scale structured and unstructured data modeling, processing, loading and analytics, and comfortable executing POCs/projects if the situation demands.
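For illustration, the dimensional-modeling pattern mentioned above (Kimball-style fact and dimension tables in a star schema) can be sketched in a few lines of SQL; the tables, columns and data below are hypothetical, not drawn from any of the projects listed.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
# All table names, columns and values are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,   -- surrogate key
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,      -- e.g. 20160105
    full_date  TEXT,
    month_name TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    sales_amount REAL                    -- the additive measure
);
INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'US');
INSERT INTO dim_date VALUES (20160105, '2016-01-05', 'January');
INSERT INTO fact_sales VALUES (1, 20160105, 250.0);
""")

# A typical dimensional query: measures from the fact table,
# grouped by attributes of the surrounding dimensions.
rows = cur.execute("""
    SELECT c.region, d.month_name, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date     d ON f.date_key     = d.date_key
    GROUP BY c.region, d.month_name
""").fetchall()
print(rows)   # → [('US', 'January', 250.0)]
```

The surrogate keys on the dimensions are what make slowly changing dimension handling and fast star joins possible; the fact table carries only keys and measures.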

Employment Summary:

Tech Mahindra Ltd.: July 2012 to date
Rolta India Limited: December 2010 to May 2012
BirlaSoft (India) Limited: April 2010 to December 2010
Mphasis An HP Company: November 2007 to January 2010
Tata Consultancy Services Ltd.: July 2004 to August 2007
Synergy Informatics: December 2000 to July 2004

Education

Degree: MCA (PG professional course)
Specialization: Computers
University: Utkal University
Year of passing: 2000

Skills:

DW/ETL Tools: Informatica PC, IDQ, ODI, BODS, BO & Talend
Data Modeling Tools: Erwin & TOAD Data Modeler
Big Data: HDFS, Sqoop, Hive, Pig & Cassandra
RDBMS: Oracle, SQL Server & Teradata
Programming Languages: SQL & PL/SQL
Scheduling Tools: AutoSys & Tivoli
Operating Systems: Windows NT/2000/XP, MS-DOS, UNIX
Tools & Utilities: PuTTY, ClearCase & VSS
Web Related: HTML
Domains: BFSI, Health Care, Manufacturing, Retail, Telecom and Transport

Project Details:

Project Name: Data Layer DMaaP Apps
Client: AT&T
Role: Data Architect/Analyst
Organization: Tech Mahindra Ltd.
Duration: Apr. 2016 to date
Team Size: 20
Environment: Informatica, Erwin data modeling, Oracle, Cassandra, SQL & PL/SQL in a Unix environment

Project Description:

The vision of the Data Access Roadmap is to establish a common data layer model that can be leveraged by all domains (Sales, Service Delivery, and Billing) to enable sharing of customer and solution data and to eliminate translation and incompatibilities between sales and service delivery. DMaaP is a premier platform for high-performing and cost-effective data movement services that transport and process data from any source to any target using its child apps DataRTR, Message Router, GDDN, DTI and Grid AC.

Contribution:

Conducting feasibility studies of business requirements and assisting team members in developing the ETL components and interfaces, checking their dependencies and load factors.

Collaborating with business users to define the key business requirements and translate them into process and technical solutions.

Involved in data modeling and the ETL process, and in classifying the key business drivers for the data management initiative.

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Preparing extensive documents on the ETL Design, Development, Deployment & daily loads of the mappings and interfaces.

Developing ETL components, Interfaces using Informatica PC to feed an Operational & external Data Store and Data Warehouse.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Working as a lead member and contributing on architectural level inputs and suggestions for critical ETL component design.

Project Name: OPR Dashboard Reporting - POC
Client: BNSF Railway Company, USA
Role: Data Architect
Organization: Tech Mahindra Limited
Location: Bhubaneswar
Duration: Jan. 2016 to Mar. 2016
Environment: Languages: NoSQL (CQL); Database: SQL Server & Cassandra (NoSQL); Tools: KDM; OS: Unix

a) Project Description

The current AS-IS architecture sources its data from the underlying systems eCap, EDW, ODS and SIDOL for OPR reporting. While EDW and ODS are DB2 sources, SIDOL is based on SQL Server. Business-rule application, data extraction and loading are done through both ELT and ETL approaches using WebFOCUS.

The TO-BE architecture is based purely on an event-driven data extraction approach, using Fuse and Camel for event-related activities. Apache Spark will be leveraged for applying the business rules, while data services will be hosted on Cassandra. The presentation layer will use HTML5 to provide dynamic and interactive user interfaces, with an IDE-based approach to creating reports.

b) Contribution

Responsible for defining the complete data model and classifying the key business drivers for this POC initiative.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Created standard abbreviation document for logical, physical and CQL data models.

Created logical, physical and dimensional data models (Data warehousing data modelling)

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Used ERwin's Model Mart for effective model management (sharing, dividing and reusing model information and designs) to improve productivity.

Designed data cleansing/data scrubbing techniques to ensure consistency amongst data sets.

Project Name: GE Corp - DW Projects
Client: GE Capital & Corp.
Role: Data Architect
Organization: Tech Mahindra Limited
Location: Offshore (Bhubaneswar)
Duration: Sept 2012 to Dec. 2015
Environment: Languages: SQL, PL/SQL; Database: Oracle & Teradata; Tools: Informatica PC 9.5; OS: Unix

a) Project Description

GE Corp. & Capital has several major data warehouse applications: EDW Commercial Finance for the capital lending business, GOF Archival for data migration of capital banking in the Europe region, Capital Sourcing and MENAT for purchasing systems with supplier & receipt information, and GOPSU and GOPDW for seat utilization solutions. The ETL processes analyze, extract, transform & load the scoped data from different sources into the warehouse systems, and business intelligence processes are set up for data reporting, forecasting and trend analytics.

b) Contribution

Responsible for defining the complete data warehouse architecture (i.e. data modeling, ETL process, Data Marts, EDW) and classifying the key business drivers for the data warehouse initiative.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Suggest steps for best possible use of processes for new components.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Defining configuration management processes and setting up versioned and global repositories in Informatica.

Managed change control implementations and coordinating weekly, monthly & quarterly release processes.

Setting up Scheduling mechanism and appropriate tool selections based on project requirements.

Project Name: One View
Client: Product Development
Role: Data Architect
Organization: Rolta India Limited
Location: Offshore (Mumbai)
Duration: December 2010 to May 2012
Team Size: 30
Environment: Languages: SQL, PL/SQL; Database: Oracle & SQL Server; Tools: Informatica, BODS, ODI; OS: Windows XP & Unix

a) Project Description

The OneView solution provides process and power industries with a comprehensive decision support platform and modules that enable executives to transform their organizations. It delivers innovation, insight & impact for the refining, upstream oil, petrochemical & power industries by collecting, cleaning and sharing asset performance data. The ETL processes analyze, extract, transform & load the scoped data from different source systems into the warehouse system.

b) Contribution

Responsible for defining the complete data warehouse architecture (i.e. data modeling, ETL process, Data Marts, EDW) and classifying the key business drivers for the data warehouse initiative.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Responsible for ETL Software Installation, Validating and configuring server environments.

Suggest steps for best possible use of processes for new components.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Defining configuration management processes and setting up versioned and global repositories in Informatica.

Managed change control implementations and coordinating weekly, monthly & quarterly release processes.

Setting up Scheduling mechanism and appropriate tool selections based on project requirements.

Project Name: GSDW (Global Sourcing Data Warehousing)
Client: GE Energy
Role: ETL Architect
Organization: BirlaSoft (India) Limited
Location: Offshore (Bangalore)
Duration: April 2010 to December 2010
Team Size: 12
Environment: Languages: SQL, PL/SQL; Database: Oracle 10g; Tools: Informatica; OS: Unix

a) Project Description

The Global Sourcing Data Warehouse is being developed for GE Energy, USA. The GSDW system was developed to control the integration of GE Energy purchasing systems, with supplier & receipt information, into a single enterprise system. The ETL processes analyze, extract, transform & load the scoped data from 24 ERP & non-ERP systems into a historical warehouse system.

b) Contribution

Responsible for defining the complete data warehouse architecture (i.e. data modeling, ETL process, Data Marts, EDW) and classifying the key business drivers for the data warehouse initiative.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Documented logical, physical and dimensional data models (Data warehousing data modelling)

Created entity relationship diagrams and multidimensional data models, reports and diagrams based on the requirements.

Suggest steps for best possible use of processes for new components.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Project Name: AMGEN BI Development Applications
Client: AMGEN, USA
Role: ETL Tech Lead
Organization: Mphasis An HP Company
Location: Offshore (Bangalore)
Duration: November 2007 to January 2010
Team Size: 15
Environment: Languages: SQL, PL/SQL; Database: Oracle; Tools: Informatica, PL/SQL Developer, Erwin; OS: Unix

a) Project Description

Amgen is a leading human therapeutics company in the biotechnology industry that has tapped the power of scientific discovery and innovation to advance the practice of medicine. The aim of this project is to bring all Amgen business information onto a single platform for better business management.

b) Contribution

Responsible for ETL Software Installation, Validating and configuring server environments.

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Suggest steps for best possible use of processes for new components.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Defining configuration management processes and setting up versioned and global repositories in Informatica.

Managed change control implementations and coordinating weekly, monthly & quarterly release processes.

Setting up Scheduling mechanism and appropriate tool selections based on project requirements.

Project Name: Annuity One
Client: Infineon Technologies, Bangalore
Role: ETL Lead
Organization: Tata Consultancy Services Ltd.
Location: Offshore (Bangalore)
Duration: October 2005 to March 2007
Team Size: 15
Environment: Languages: SQL & PL/SQL; Database: Oracle 9i; Tools: Informatica PC 7.1.2, Business Objects 5.1.7 & PL/SQL Developer; OS: Sun Solaris 2.6

a) Project Description

The scope of the project is to analyze Infineon products such as semiconductors and chip designs for domestic & international customers. The aim of this project is to bring all Infineon businesses onto a single platform for managing the required applications.

b) Contribution

Feasibility study of business requirements and assisting the team members in developing the ETL components, checking their dependencies and load factors.

Preparing extensive documents on the ETL Design, Development, Deployment & daily loads of the mappings.

Hands on experience of developing ETL processes using Informatica/ODI and BODS to feed an Operational Data Store and Data Warehouse.

Responsible for defining the testing procedures, test plans, error handling strategy and performance tuning for mappings, Jobs and interfaces.

Performing code reviews for ETL mappings from a performance & Error handling perspective.

Involved in development, implementation and support of ETL processes for applications of up to 3 TB.

Defining configuration management processes and setting up versioned and global repositories in Informatica & BODS.

Managed change control implementations and coordinating weekly, monthly & quarterly release processes.

Setting up Scheduling mechanism and appropriate tool selections based on project requirement functionality.
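The daily-load and scheduling work described above typically relies on an incremental (delta) pattern: each scheduled run picks up only the rows changed since the previous run's high-water mark. A minimal sketch, with illustrative data and a hypothetical last-updated column:

```python
from datetime import datetime

# Toy source rows carrying a last-updated timestamp (illustrative data).
source = [
    {"id": 1, "value": "a", "updated": datetime(2016, 1, 1)},
    {"id": 2, "value": "b", "updated": datetime(2016, 1, 3)},
]

target = {1: "a"}                    # warehouse state after the previous load
last_load = datetime(2016, 1, 2)     # high-water mark saved by the last run

def incremental_load(source, target, last_load):
    """Apply only rows changed since the previous run (delta load)."""
    for row in source:
        if row["updated"] > last_load:
            target[row["id"]] = row["value"]   # insert or update
    return target

incremental_load(source, target, last_load)
print(target)   # only id 2 is picked up in this run
```

In a real scheduler-driven job (AutoSys/Tivoli) the high-water mark would be persisted between runs rather than hard-coded.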

Project Domain: Banking
Sub Domain: Core Banking
Solution: Internet Banking - Development and Testing

Project Name: OBTS Release 8.0 & 9.0
Client: Deutsche Bank, Belgium
Role: ETL Consultant
Organization: Tata Consultancy Services Ltd.
Location: Onsite (Belgium)
Duration: April 2007 to June 2007
Team Size: 2
Environment: Languages: SQL & PL/SQL; Database: Oracle; Tools: Informatica; OS: Unix

a) Project Description

DB BE is standardizing its tools for the ODS and other data warehouse platforms, and Informatica has been chosen as the ETL tool. Oracle 10g on Solaris has been chosen as the platform to host the ODS and the other data warehouses. It is understood that Informatica will also be used for data migration between systems.

b) Contribution

Collaborating with business users to define the key business requirements and translate them into process/technical solutions.

Suggest steps for best possible use of processes for new components.

Defining configuration management processes and setting up versioned and global repositories in Informatica.

Designed and developed the Informatica mappings for source system data extraction, data staging, transformation, validation and error handling.

Involved with the error handling & performance tuning strategy for the Informatica ETL process.
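The extraction, transformation, validation and error-handling flow described above can be sketched as a simple pipeline in which invalid rows are diverted to a reject list rather than failing the whole batch; the field names and rules below are illustrative, not taken from the actual mappings:

```python
# Sketch of extract → validate → transform → load with an error path.
# Bad rows land in `rejects` with a reason, mirroring the reject-file
# pattern used in ETL mappings; good rows continue to the load step.
raw_rows = [
    {"account": "100", "amount": "250.50"},
    {"account": "",    "amount": "90.00"},    # fails validation
    {"account": "101", "amount": "oops"},     # fails type conversion
]

loaded, rejects = [], []
for row in raw_rows:
    if not row["account"]:                    # validation rule
        rejects.append((row, "missing account"))
        continue
    try:
        amount = float(row["amount"])         # transformation step
    except ValueError as exc:
        rejects.append((row, str(exc)))
        continue
    loaded.append({"account": row["account"], "amount": amount})

print(len(loaded), len(rejects))   # 1 good row, 2 rejects
```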

Project Domain: Banking
Sub Domain: Consumer Lending & Cards
Solution: Loans

Project Name: ARSM
Client: Deutsche Bank, London
Role: Team Member
Organization: Tata Consultancy Services Ltd.
Location: Offshore (Bangalore)
Duration: July 2004 to October 2005
Team Size: 5
Environment: Languages: SQL & PL/SQL; Database: Oracle 9i; Tools: Toad; OS: Sun Solaris 2.6

a) Project Description

The Advance Ratio Suggestion Module (ARSM) is a tool for DB to calculate the advance ratio for loans based on different securities & their market data.

b) Contribution

Reviewed, tested and implemented code; fixed bugs per client requirements; scheduled & monitored jobs for production batches.

Project Name: ETL for Monthly Profitability Analysis Application
Client: Escorts Limited, India
Role: Team Member
Organization: Synergy Informatics
Location: Offshore (Bangalore)
Duration: February 2001 to July 2004
Team Size: 5
Environment: Languages: SQL & PL/SQL; Skills: Oracle 8i, RCS with Windows NT

a) Project Description

Escorts Limited is an organization that trades in Escorts motors for domestic & international customers. The aim of this project is to bring all Escorts businesses onto a single platform for managing store/retail operations; here I developed procedures, functions, packages, mappings, mapplets & interfaces.


