Pankaj K Chaudhari
Flat no. **, Wing-A, Yashogandh Appt,
Nr Talera Hospital, Keshavnagar, Chinchwad, Pune, Pin - 411033
Cell: +91-956*******, E-Mail: *******************@*****.***
Hadoop Developer with 3.11 years of experience.
Objective
To work for an organization that enables the best use of my skills and potential in achieving the organizational goals.
PROFESSIONAL EXPERIENCE
Capgemini India Pvt Ltd, Pune 2nd Apr 2015 – 15th Sept 2015
Associate Consultant
Project : Cdiscount (POC)
Environment : Hadoop, Hive, Pig, Impala, Spark, HBase, Talend Big Data Studio 6.0, Netezza
Responsibilities :
Pushed data as delimited files into HDFS using Talend Big Data Studio.
Used various Talend Hadoop components such as Hive, Pig, and Spark.
Loaded data into Parquet files, applying transformations using Impala (see the sketch after this list).
Implemented and tested analytical solutions in Hadoop.
Coordinated with ETL developers on the preparation of Hive and Pig scripts.
Wrote SQL scripts to support existing applications.
Executed parameterized Pig, Hive, Impala, and UNIX batches in production.
Created Hive/Pig scripts for ETL purposes.
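A minimal sketch of the kind of Impala statement this involved, landing transformed data in Parquet; the table and column names (sales_raw, sales_parquet, and the listed columns) are hypothetical, not from the project:

    -- Land transformed data in Parquet via CREATE TABLE ... AS SELECT.
    -- All table and column names here are illustrative.
    CREATE TABLE sales_parquet
    STORED AS PARQUET
    AS
    SELECT
      order_id,
      CAST(order_ts AS TIMESTAMP)   AS order_ts,
      UPPER(TRIM(customer_code))    AS customer_code,
      CAST(amount AS DECIMAL(12,2)) AS amount
    FROM sales_raw
    WHERE amount IS NOT NULL;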
PRGX India Pvt Ltd, Pune Oct 2013 – 13th Mar 2015
Software Developer
Project : Wal-Mart, Sobeys, Amazon, Waitrose.
Role : Software Developer.
Environment : Hadoop, HDFS, Hive, Impala, Beeline, UNIX, Talend, AS/400, Pig
Responsibilities :
Loaded and transformed data into HDFS from large sets of structured data (AS/400, Mainframe, Oracle, SQL Server) using Talend Big Data Studio.
Migrated Mainframe code to Hive/Impala.
Optimized and performance-tuned HiveQL; formatted table columns using Hive functions.
Designed and created Hive tables to load data into Hadoop, and processed data by merging, sorting, and joining tables.
Responsible for the end-to-end data flow and the quality of the data.
Automated the process using UNIX and prepared standardized jobs for production.
Involved in the design and implementation of standards for development, testing, and deployment.
Analyzed and understood data sources such as AP, PO, POS, EDI, and LI data.
Created Hive/Impala/Pig scripts, identified parameters per requirements to apply transformations, and performed unit testing on data per design (see the parameterized batch sketch after this list).
Designed, developed, and unit-tested scripts for data items using Hive/Impala.
Exported final tables from HDFS to SQL Server using Sqoop.
Assisted in the design and development of ETL procedures per business requirements for the retail domain.
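A minimal sketch of a parameterized Hive batch of the kind described above; the table, column, and variable names (staging_ap_raw, ap_invoices, LOAD_DATE) are hypothetical. A final table like this would then be exported to SQL Server with sqoop export:

    -- Hypothetical parameterized HiveQL batch; invoked as, for example:
    --   hive --hivevar LOAD_DATE=2014-01-31 -f load_ap_invoices.hql
    -- Table, column, and variable names are illustrative.
    INSERT OVERWRITE TABLE ap_invoices PARTITION (load_date = '${hivevar:LOAD_DATE}')
    SELECT
      invoice_id,
      vendor_id,
      CAST(invoice_amt AS DECIMAL(14,2)) AS invoice_amt
    FROM staging_ap_raw
    WHERE load_date = '${hivevar:LOAD_DATE}';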
PRGX India Pvt Ltd, Pune Dec 2012 – Oct 2013
ETL Developer
Project : Operational Data Services
Environment : Informatica, Talend, SQL Server 2008, Oracle 10g, proprietary software
Responsibilities :
Restored Oracle data dumps in Oracle Enterprise Manager.
Extracted data from ASCII, EBCDIC, and flat files, Oracle Data Dump/Data Pump, and AS/400.
Created COBOL layouts and X2CJ files, and transformed data from source to target tables using Talend and Informatica.
Designed, developed, unit-tested, and supported ETL mappings and scripts for data marts using Talend; checked and fixed delimiters in ASCII files.
Loaded data into AS/400 and performed data conversion in proprietary software and Talend.
Tested loaded data against raw files and Oracle data in SQL Server.
Wrote SQL scripts per business requirements to process loaded data (see the sketch after this list).
Created stats and logs and applied data transformations in SQL Server.
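A minimal T-SQL sketch of the stats-and-transform pattern described above; the table and column names (etl_load_log, invoice_stage, vendor_code) are hypothetical:

    -- Log a row count for the loaded table before transforming it.
    -- All object names here are illustrative, not from the project.
    INSERT INTO etl_load_log (table_name, row_count, loaded_at)
    SELECT 'invoice_stage', COUNT(*), GETDATE()
    FROM invoice_stage;

    -- Example transformation: trim and upper-case a code column.
    UPDATE invoice_stage
    SET vendor_code = UPPER(LTRIM(RTRIM(vendor_code)))
    WHERE vendor_code IS NOT NULL;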
PRGX India Pvt Ltd, Pune Dec 2011 – Dec 2012
Trainee Developer
Project : Waitrose
Role : ETL Developer.
Environment : SSMA, Talend, SQL Server 2008, Oracle 10g.
Responsibilities :
Restored Oracle data dumps in Oracle Enterprise Manager.
Loaded data from Oracle to SQL Server using SSMA and Informatica.
Converted Oracle database objects to SQL Server objects and migrated the data from Oracle to SQL Server using SSMA.
Created source-to-target mappings in Talend.
Developed ETL jobs in Talend to load data from ASCII, EBCDIC, and flat files.
Created parameterized SQL scripts to apply data transformations (see the sketch after this list).
Performed SQL transformations on loaded data.
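A minimal sketch of a parameterized script of this kind using sqlcmd scripting variables; the table, column, and variable names (orders_stage, BatchDate) are hypothetical:

    -- Hypothetical parameterized T-SQL script; invoked as, for example:
    --   sqlcmd -S server -d warehouse -i transform_orders.sql -v BatchDate="2012-06-30"
    -- Table, column, and variable names are illustrative.
    UPDATE orders_stage
    SET status = 'PROCESSED'
    WHERE load_date = '$(BatchDate)'
      AND status = 'LOADED';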
NIIT Pvt Ltd, Pune April 2011 – Oct 2011
Internship
Project : Orion Star.
Role : Trainee ETL Developer.
Environment : SAS/BASE, SAS/OLAP, SAS/GRAPH, SAS/ETL, SAS/DIS, SAS/IMS, SAS/Reports.
Responsibilities :
Created and designed OLAP cubes using SAS OLAP Cube Studio.
Wrote MDX queries against cube dimensions and levels (see the sketch after this list).
Registered libraries in the repository on the SAS Metadata Server.
Performed various tasks using SAS Base and SAS Enterprise Guide.
Designed sources, jobs, and targets using SAS OLAP Cube Studio and SAS Data Integration Studio.
Analyzed OLAP cubes using SAS OLAP Viewer and SAS datasets using SAS Enterprise Guide.
Generated information maps and reports using SAS Information Map Studio and SAS Web Report Studio, and published reports to the portal.
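A minimal MDX sketch of the kind of query this involved, assuming a cube modeled on the Orion Star sample; the cube, dimension, and measure names below are illustrative:

    // Hypothetical MDX query: one measure on columns, one level's members on rows.
    // Cube, dimension, and measure names are illustrative.
    SELECT
      { [Measures].[Total_Retail_Price] } ON COLUMNS,
      { [Geography].[Country].MEMBERS }   ON ROWS
    FROM [OrionStar]
    WHERE ( [Time].[Year].[2011] )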
Education:
Diploma in SAS Business Intelligence and Data Warehousing from NIIT.
BE (Information Technology) – North Maharashtra University – 67.53%.
HSC – Nasik Board – 70.00%.
SSC – Nasik Board – 75.06%.
Computing Skills:
• SAS Desktop Applications:
  • SAS Enterprise Guide 4.0
  • SAS Management Console 9.1
  • SAS OLAP Cube Studio 9.1
  • SAS Information Map Studio 3.1
  • SAS Data Integration Studio 3.4
  • SAS Add-In for Microsoft Office
  • SQL Server Integration Services (SSIS)
  • SAS dfPower Studio
• SAS Web Browser Applications:
  • SAS Web Report Studio
  • SAS Information Delivery Portal
  • SAS OLAP Viewer
Software Languages : Hive, Impala, Spark, HBase, MongoDB, Pig, SAS Base Programming.
Databases : SQL Server 2005, MS Access 2007, MS Excel 2007.
ETL Tools : Talend Big Data Studio, Talend, Informatica, SAS Data Loader, SAS DIS, SSIS, AS/400.
Personal Details:
• Date of Birth : June 8, 1987.
• Gender : Male.
• Marital Status : Married.
• Languages Known : English, Hindi, and Marathi.