
ETL Big Data Developer

Location:
Noida, Uttar Pradesh, India
Posted:
August 06, 2019



PARAG DOSI

Email:ac9z48@r.postjobfree.com

Phone: (M) +91-837*******

PAN NO.: BIHPD7682P

PASSPORT NO.: M0151107

DOB: 02/02/1994

CAREER OBJECTIVE

To work in a healthy, innovative, and challenging environment that brings out the best in me and enables me to learn and grow both professionally and personally.

SUMMARY

I am a Big Data, BI, DWH, and ETL developer and a performance-driven professional, passionate about applying and enhancing my technical knowledge to achieve results. I have 5 years of experience building effective business strategies, managing enterprise projects, and delivering innovative solutions with global teams. Solution-focused, I take delight in helping companies build successful solutions and products.

I am responsible, reliable, and able to work independently as well as in a team, with a high level of enthusiasm and creativity.

• Good knowledge of Big Data concepts, BI, data warehousing, and ETL tools such as Talend Big Data 6.2, Talend DI 6.0, Talend MDM 6.2, TAC, Informatica, and Sunopsis.

• Good knowledge of SQL concepts such as stored procedures, triggers, and indexes, as well as UNIX commands/scripting.

• Good knowledge of Big Data technologies such as Hive, Pig, Sqoop, HBase, and Oozie, and of HDFS file formats such as CSV and Parquet.

• Good working experience with MSSQL, Oracle 10g, MariaDB, Sybase, PostgreSQL, MySQL, and flat-file sources, handling and loading millions of records into data warehouse and Hadoop environments.

• Good knowledge of web services and of the MDM and ESB components in Talend.

• Good knowledge of Power BI as a BI tool.

• Experience working with SQuirreL SQL, GitHub, SVN, JIRA, SourceTree, and Alfresco.

• Experience building and leading teams.

• Capable of learning new technologies/tools quickly and adapting to new environments.

WORK EXPERIENCE

Organization: Harman Connected Services Corp. India Pvt. Ltd. (January ’17 to Present)

Project 1: [Jan ’17 – Present]

Client : Fishbowl

Environment : Unix, Hadoop

Technical Details : Talend Big Data 6.2, TAC, MSSQL, MariaDB, Hive, HBase, MapR 5.1 Distribution, Sqoop, GitHub

Description : Fishbowl helps restaurants leverage data to drive predictable sales growth. Fishbowl’s closed-loop restaurant marketing SaaS platform is highly scalable and ingests data quickly from various sources, including email, SMS, social, online ordering, loyalty programs, reservations, and more. The analytics platform uses industry-specific proprietary algorithms to provide clients with actionable insights about guests, menus, pricing, media mix, and social media.

Roles & Responsibilities:

• Understanding the client and product requirements.

• Development and Modification of Talend Big Data jobs as per business requirements.

• Migration of data from the older version of the product (2.0) to the new version (3.0) through Talend Big Data jobs.

• Extensive experience using multiple file formats (delimited text, Parquet, XML, .csv, .txt) and loading millions of records from these files into the Hadoop file system (HDFS).

• Working with the MSSQL, MariaDB, Hive, Sqoop, Pig, HDFS, and FTP components in Talend to extract data from these sources and load it into the Hadoop file system for further processing.

• Enhancing the performance of existing product data processing using Talend Big Data batch jobs and Pig jobs.

• Reviewing the tasks done by other team members before delivery.

• Training new team members on the Talend suite, including knowledge-transfer sessions for all team members.

Achievements:

• Set up a new Talend Big Data team and trained them in ETL and Big Data.

Organization: Sopra Steria Pvt. Ltd. (July ’14 to December ’16)

Project 1: [Nov ’14 – Dec ’16]

Client : ID Group, France

Environment : Windows

Technical Details : Talend DI 6.0, Sunopsis, Oracle 10g, SQuirreL SQL

Description : ID Group is a leading company in the children's market, encompassing 5 brands (OBAIBI, OKAIDI, JACADI, OXYBUL and IDKIDS). The project involves development and migration tasks to handle raw data coming from different databases, performing various transformations and optimizations so that the organized data, stored in a data warehouse, can be used for analysis and reporting as per the business requirements.

Roles & Responsibilities:

• Understanding the customer specifications and writing Technical Designs.

• Development and Modification of Talend jobs as per business requirements.

• Migration of Mappings from Sunopsis to Talend DI.

• Extensive experience using multiple file formats (XML, .csv, .txt) and loading millions of records from these files into relational tables.

• Working with the Oracle, Sybase, PostgreSQL, and MySQL components in Talend to extract data from these sources and load it into the data warehouse for further processing.

• Working on Web service, MDM and ESB components.

• Enhancing the performance of the existing Sunopsis system using Talend components such as tOracleOutputBulkExec, truncating tables before loading, and creating indexes after the data load so that execution takes less time.

• Reviewing the tasks done by other team members before delivery.

• Training new team members on Talend MDM, including knowledge-transfer sessions for all team members.

Achievements:

• Received global appreciation for creating an automation tool for the project's daily data tracker using Talend.

• Received several appreciations from the client and the onshore team for excellent work quality.

• Applauded in V1 meetings and vertical-level town halls for good productivity and fewer defects.

• Ranked 2nd in Sopra internal technical assessments among people with the same level of work experience.

Project 2: [July ’14 – Nov ’14]

Client : BNP Paribas, France

Environment : Windows, Unix

Technical Details : Informatica 9.1, Oracle 10g, PuTTY

Description : A development-cum-maintenance project maintaining a data warehouse called the Central Information Warehouse (CIW), which captures and provides data for analysis and reporting to the Business Intelligence team. This data warehouse contains both referential and transactional data, covering customer data, organizational data, arrangement data, rating data, externally bought company data, transactional data, etc.

Roles & Responsibilities:

• Loading source data into snapshot tables, and further from snapshot tables into history and delta tables, using ETL processes.

• Creating and handling output flat files for reporting purpose.

• Performing audit logging and error logging as per standard PowerCenter procedures.

• Cleaning history tables as part of the housekeeping job.

• Enhancing performance using pushdown optimization, truncating tables before loading snapshot tables, and disabling and recreating indexes before and after data loads.

• Performing automated testing and error correction to cleanse the mappings and remove formatting errors before migrating them to higher environments.

ACADEMIC QUALIFICATIONS

B.Tech. : Jaipur Engineering College and Research Centre, Jaipur (RTU, Kota), passing year 2014, aggregate 76.12%

XII : Central Public Sr. Sec. School, Udaipur (CBSE), passing year 2010, aggregate 71.0%

X : Govt. Sr. Sec. School, Ganora (Banswara) (RBSE), passing year 2008, aggregate 70.67%

CERTIFICATIONS

Certified as IBM DB2 Academic Associate: DB2 Database and Application Fundamentals [Oct ’13]

WORKSHOPS

Attended ‘Microsoft Tech Days’, a workshop on Windows 8 by Microsoft at JECRC Jaipur [Jan ’13]

EXTRA-CURRICULAR ACTIVITIES

Coordinator of all team-building activities at the project level in Sopra Steria India. [2014–present]

Core team member of the organizing committee of Renaissance, an annual national-level technical-cum-cultural fest of JECRC Jaipur. [2011–2013]

Event organizer for ‘Entrapment’, a laser-show event in Renaissance. [Mar ’12]


