
Data Engineer



Techno-functional professional with expertise in independently leading projects, from understanding requirements through providing go-live support; targeting assignments with an organisation of high repute

Location Preference: Bangalore

Executive Profile

Offering approximately 11 years of experience in software consultancy across big data, technical project execution, and application architecture

Proficient in grasping the big picture, conceptualizing, developing, and implementing solutions

Worked on data ingestion, cleaning, and consolidation processes for large distributed systems (big data and data lakes) from an architecture and development perspective

Participated in ETL design of new or changed mappings and workflows with the team and prepared technical specifications

Created ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepared the corresponding documentation

Designed and built integrations supporting standard data warehousing objects (Type-2 dimensions, aggregations, star schemas, etc.)

Performed source system analysis

Used BI tools such as Tableau and Cognos

Worked with DBAs and data architects to plan and implement appropriate data partitioning strategies in the enterprise data warehouse

Implemented versioning of the ETL repository and supporting code as necessary

Created mappings in Talend for more than 3 years using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.

Worked with Talend Big Data, Hadoop, and Hive, using Talend Big Data components such as tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHBaseInput, tHBaseOutput, tSqoopImport, and tSqoopExport

Used XML and JSON file formats for more than 3 years

Excel at writing Python and HQL scripts, developing business logic, preparing complex queries, and building customized business reports per business requirements (see the sketch after this list)

Created logical and physical data models using best practices to ensure high data quality and reduced redundancy

Maintained conceptual, logical, and physical data models along with corresponding metadata

Good knowledge of metadata management, data modeling, and related tools (Erwin and ER Studio)

More than 6 years of expertise working with star and snowflake schemas

Set up backup methods for high database availability, minimal downtime, and complete data recovery

Received "Pat on the Back", "Value Innovation", "Achiever of the Quarter", "Team of the Quarter", and various other recognitions

An innovative & result-oriented professional with strong planning, communication, interpersonal & analytical skills
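
For illustration, a minimal sketch of the kind of Python-driven HiveQL reporting described above. The schema, table, and column names here are hypothetical; in practice the rendered query would be submitted through beeline or a Hive client:

def build_sales_report_hql(schema: str, start_dt: str, end_dt: str) -> str:
    """Render a parameterized HiveQL aggregate for a business report."""
    # Hypothetical table and columns, for illustration only.
    return f"""
        SELECT region,
               product_category,
               SUM(net_amount)             AS total_sales,
               COUNT(DISTINCT customer_id) AS customers
        FROM   {schema}.sales_fact
        WHERE  txn_date BETWEEN '{start_dt}' AND '{end_dt}'
        GROUP  BY region, product_category
        ORDER  BY total_sales DESC
    """

if __name__ == "__main__":
    # Example: render a Q1 report query for an assumed "analytics" schema.
    print(build_sales_report_hql("analytics", "2020-01-01", "2020-03-31"))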

Education

B.Tech. in Electronics & Telecommunications Engineering from Biju Patnaik University of Technology in 2008

Certification:

GCP Certified Professional

Work Experience

Since Nov’18

Infogain, Bangalore as Technology Lead

Key Result Areas:

Participated in all phases of the development life cycle, with extensive involvement in definition and design meetings and in functional and technical walkthroughs

Created mapping documents for all fact and dimension tables

Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.

Created and deployed physical objects, including custom tables, custom views, stored procedures, and indexes, to SQL Server for the staging and data-mart environments

Participated in ETL design of new or changed mappings and workflows with the team and prepared technical specifications

Created ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepared the corresponding documentation

Designed and built integrations supporting standard data warehousing objects (Type-2 dimensions, aggregations, star schemas, etc.)

Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, covering fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2; a sketch follows this list)

Created many complex ETL jobs for data exchange to and from the database server and various other systems, including RDBMS, XML, CSV, JSON, and flat-file structures

Used BI tools such as Tableau and Cognos
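
As a sketch of the SCD Type 2 pattern mentioned above. Column names (customer_id, city, effective dates, current flag) are hypothetical, and the production loads themselves were built as Informatica/Talend mappings rather than Python:

from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended end date marking the current row

def apply_scd2(dimension_rows, source_row, load_date):
    """Apply one incoming source row to a Type 2 dimension: expire the
    current version when a tracked attribute changes, insert a new version."""
    out, matched = [], False
    for row in dimension_rows:
        if row["customer_id"] == source_row["customer_id"] and row["is_current"]:
            matched = True
            if row["city"] != source_row["city"]:
                # Tracked attribute changed: close out the old version...
                out.append({**row, "effective_to": load_date, "is_current": False})
                # ...and open a new current version.
                out.append({"customer_id": source_row["customer_id"],
                            "city": source_row["city"],
                            "effective_from": load_date,
                            "effective_to": HIGH_DATE,
                            "is_current": True})
            else:
                out.append(row)  # unchanged: keep as-is (Type 1 would overwrite)
        else:
            out.append(row)
    if not matched:
        # New business key: insert the first version of this row.
        out.append({"customer_id": source_row["customer_id"],
                    "city": source_row["city"],
                    "effective_from": load_date,
                    "effective_to": HIGH_DATE,
                    "is_current": True})
    return out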

Key Impact Areas

Informatica

Talend

Hadoop/Hive/Spark

Google Cloud

Requirement Gathering

SQL/PLSQL/Python/Shell

DWH/Data Modelling

Solution Design

Tableau/Cognos

Previous Experience

Jul’17 – Aug’18

KPIT Technology Limited, Bangalore as Tech Lead

Jun’14 – May’17

Capgemini Technology Services Limited, Mumbai as Senior Software Engineer

Sep’09 – Jun’14

PWC (client), Mumbai as Software Engineer

Nov’08 – Aug’09

NR Switch N Radio Services Pvt. Ltd., Kota as Trainee Engineer

Major Projects (Recent 5)

At Infogain, Bangalore as Technology Lead

Project Name: NCR Customer Insight

Period: 2019 to 2020

Technologies: Hive, Talend Enterprise Big Data Edition 6.4.1, Apache Dremio, Bitbucket, Tableau

Description: The project involved building a new data lake in Hadoop using Hive and creating virtual datasets in Dremio for use by Tableau and as input to Spark ML

Project Name: Data Monetization

Period: 2018 to 2019

Technologies: Informatica PowerCenter 10.x, PowerExchange, Oracle 11g, Toad, Control-M, PuTTY, WinSCP, data modelling, Erwin Studio, Cognos

Description: This project involved analysing reports published by government bodies such as the CDC, FDA, ATSDR, EPA, and WHO, along with reports that various providers must submit to government regulatory bodies

Project Name: Loyalty Bonus

Period: 2017 to 2018

Technologies: Hive, Talend Data Integration 6.4.1, Talend Administrator Console, ESP scheduling tool, MySQL, Netezza, data modelling, Oracle Exadata, XML, JSON, Tableau

Description: A customer loyalty program is a structured and long-term marketing effort which provides incentives to repeat customers who demonstrate loyal buying behavior. Successful programs are designed to motivate customers in a business's target market to return often, make frequent purchases, and shun competitors.

Project Name: Quality Analytics Data

Period: 2016 to 2017

Technologies: Talend Data Integration 6.0.1, JSON, Parquet, DB2, SQL, Unix, Talend Administrator Console, Tableau

Description: Quality Data Analytics is a unique combination of flexible online database query and browsing capabilities and advanced statistical analysis tools in one uniform, browser-based analysis environment

Project Name: SMBC

Period: 2016 to 2017

Technologies: Informatica (ETL), Oracle, SQL, PL/SQL, Unix, data modelling, Cognos

Description: The SMBC project integrated a major insurance company with SMBC for loan insurance. The goal of this project was to integrate the required functionalities into the existing finance mart and to apply different rules to allocate accounting data at the lowest levels of the customer structure hierarchy

S SEKHAR DAS

addc2w@r.postjobfree.com

+91-911*******


Personal Details

Date of Birth: 1st July 1985

Languages Known: English, Hindi & Odia

Address: Flat 305, Vakil Daffodils, Chandapura, Bangalore


