SAUMIL SHETH
Big Data / Hadoop Developer
Phone: (1) 732-***-****
E-Mail: *****.******@*****.***
Professional Summary:
11+ years of IT experience in software design, development, testing, and data analytics.
2+ years of experience in developing Big Data applications using Hadoop (CDH), HDFS, Spark, Scala, MapReduce, Hive, and Pig.
Experience with Kafka and various streaming technologies.
Experience with Cloudera and Hortonworks distributions.
Hands-on experience with the NoSQL databases HBase, Cassandra, Vertica, and MongoDB.
Hands-on experience importing and exporting data between relational databases and HDFS, Hive, and HBase using Sqoop.
6+ years of development experience in ETL methodologies using IBM WebSphere DataStage for extraction, transformation, manipulation, and aggregation of data; NETeXPERT rule writing; and Centura Team Developer.
7+ years of hands-on experience with various database technologies: Oracle, MySQL, Teradata, BTEQ scripts, and PL/SQL.
6+ years of hands-on experience in data analytics of telecom networks and various call detail records (including E911), predictive data analysis, and time-series data analysis.
Experience in database design using the E-R model, database programming skills in Oracle, and data modeling.
Successfully conducted demos for clients to ensure product quality meets expectations.
Experience in deploying code and conducting post-deployment activities in coordination with cross-functional teams.
Provided production support, development integration testing, and user acceptance testing, and aided in root cause analysis of trouble tickets.
Goal-oriented and innovative, with excellent people skills and the ability to manage change with ease across diverse situations, technologies, and domains.
Excellent problem-solving and strong analytical skills.
As part of an Agile team, participated in scrum grooming sessions and planned and estimated capacity vs. availability of resources and tasks.
Education: B.E in Computer Engineering, Mumbai University, 2004.
Certification: Big Data specialization by University of California, San Diego on Coursera.
Technical Skills / Knowledge base:
Big Data technology and its ecosystem
Hadoop, MapReduce, Spark 1.x.x, Pig, Hive, Splunk, Neo4j 2.3.2, KNIME 3.1.x, Hue, Oozie, ZooKeeper, Sqoop 1.4.x, R.
Programming / scripting languages
Python, Scala, Shell, Java.
Database
Oracle, PL/SQL, MySQL, Teradata.
NoSQL
HBase 1.x, Cassandra, MongoDB, Vertica.
Reports
Crystal Reports, Cognos Report.
Other Tools
TIBCO BPM, TIBCO BW, HP Quality center, JIRA, I-TRACK, Rally Dev.
OSS COTS Product
NETeXPERT 6.2, Clarity, MAXIMO (IBM Tivoli).
Experience:
I.Client Name : AT&T, New Jersey (TechMahindra America Inc.)
Duration : Mar’14 till date.
Role : Big Data Developer
Project : Service Quality Management
The SQM platform is built on the Hadoop distributed file system to fetch real-time network data and calculate KPIs/KQIs.
Developed and tested multiple MapReduce jobs in Spark (Scala/Python) for data cleaning, transformation, filtering, modification, joining, and merging of different network data sources for further processing.
Developed and integrated jobs using the Holt-Winters algorithm, implemented by the research team, for real-time time-series event/alarm analysis and KPI monitoring.
Wrote and reviewed data mappings and designed data models for new KPIs in Hive; the mapping details how data moves from source to target.
Involved in finalizing DAG/flow diagrams of MapReduce programs with the principal architect.
Monitored DAG visualization, event timelines, and other task/job metrics in the Spark UI.
Troubleshot performance issues in Spark jobs by tuning the code.
Involved in importing/exporting data between RDBMS and HDFS using Sqoop.
Involved in creating Hive tables.
Built HiveQL queries for data validation and testing.
Created shell scripts in UNIX/Linux environments to automate data flow and transfer.
Understood and tested network topology, inventory, and KPI aggregation methods.
Reviewed requirements in each iteration, defined tasks, estimated effort, and delivered within timelines.
Responsible for L2 support.
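The time-series KPI monitoring described above can be illustrated with a minimal additive Holt-Winters smoother. This is a sketch only, not the project's actual implementation: the smoothing parameters, season length, and threshold are hypothetical values chosen for illustration.

```python
# Illustrative sketch: additive Holt-Winters smoothing for a KPI time
# series, with a simple deviation-based anomaly flag. All parameter
# values (alpha, beta, gamma, threshold) are hypothetical.

def holt_winters_additive(series, season_len, alpha=0.5, beta=0.1, gamma=0.1):
    """Return one-step-ahead fitted values for an additive Holt-Winters model."""
    # Initialize level, trend, and seasonal components from the first seasons.
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len]) -
             sum(series[:season_len])) / season_len ** 2
    seasonals = [x - level for x in series[:season_len]]

    fitted = []
    for i, x in enumerate(series):
        s = seasonals[i % season_len]
        fitted.append(level + trend + s)          # one-step-ahead forecast
        last_level = level
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonals[i % season_len] = gamma * (x - level) + (1 - gamma) * s
    return fitted


def flag_anomalies(series, fitted, threshold):
    """Flag indices where the KPI deviates from its forecast by more than threshold."""
    return [i for i, (x, f) in enumerate(zip(series, fitted))
            if abs(x - f) > threshold]
```

For hourly KPI data, for example, one would pass `season_len=24` so the seasonal component captures the daily traffic cycle.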
II.Client Name : AT&T, New Jersey (TechMahindra America Inc./India)
Duration : Nov’10 to Feb’14.
Role : Lead ETL / Software Developer
Project : VOLTE/LTE RAN/UMTS/ IR94
With the progression of the AT&T network towards UMTS/LTE/VoLTE/IR94 technologies, there was a need to enhance the supporting IT infrastructure for management of these networks and their overlaying services. The KPIs are developed using performance counters from various network elements and call detail records (CDRs) of both regular and E911 calls.
Developed DataStage jobs utilizing sequential/complex file, transform, filter, modify, join, merge, and remove-duplicates stages.
Developed stored procedures (PL/SQL) and scripts to dynamically monitor the quality of processed data by comparing it with raw data to detect issues such as data mismatch, data loss, KPI calculation errors, and wrong data aggregation.
Created views using DB links to access data from another database.
Analyzed a variety of CDR data on Teradata/Vertica across various network elements to validate call flows (mobility) operating on different 2G/3G/VoLTE technologies.
Implemented Apache Kafka on Cambria (UEB) services.
Troubleshot data to report gaps in algorithms, KPI calculations, and call pattern issues to stakeholders.
Carried out analysis of BTEQ scripts using simulated lab call detail records as well as live AT&T network calls.
Reviewed data mappings, data models, logic for data flow, extraction, and aggregation, and data retention policies with the principal architect and the development and test teams for new KPIs/deliverables of the E2E system.
Reviewed requirements in each iteration, defined tasks, estimated effort, and delivered within timelines.
Checked data quality of the E2E system, including the ETL tool, the predictive analytics layer, and the various integrated systems.
Understood and tested algorithms to identify different call patterns (mobility) and their complex KPI calculations.
Conducted client demos before and after scheduled product releases to gather client feedback as part of Agile methodologies.
Coordinated production deployment activities, supported the ORT phase for users, and conducted and tracked RCA on production tickets.
Trained resources and led the offshore team.
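The raw-vs-processed data-quality checks described above (implemented in the project as PL/SQL stored procedures) can be sketched as a reconciliation pass that recomputes aggregates from raw records and diffs them against the processed KPI table. The field names (`element_id`, `duration`) and the aggregation are hypothetical, purely for illustration.

```python
# Illustrative sketch only (the project used PL/SQL stored procedures):
# reconcile an aggregated KPI table against raw records to catch data
# loss, mismatches, and wrong aggregation. Field names are hypothetical.

from collections import defaultdict

def aggregate_raw(raw_rows):
    """Recompute per-element call counts and duration sums from raw CDR rows."""
    agg = defaultdict(lambda: {"calls": 0, "duration": 0})
    for row in raw_rows:
        bucket = agg[row["element_id"]]
        bucket["calls"] += 1
        bucket["duration"] += row["duration"]
    return dict(agg)

def reconcile(raw_rows, processed):
    """Return a list of discrepancies between raw-derived and processed KPIs."""
    expected = aggregate_raw(raw_rows)
    issues = []
    for elem, exp in expected.items():
        got = processed.get(elem)
        if got is None:
            issues.append((elem, "missing in processed (possible data loss)"))
        elif got != exp:
            issues.append((elem, f"mismatch: expected {exp}, got {got}"))
    # Processed rows with no raw counterpart indicate wrong aggregation keys.
    for elem in processed.keys() - expected.keys():
        issues.append((elem, "extra element not present in raw data"))
    return issues
```

The same shape of check (recompute, diff, report) applies whether the comparison runs in SQL, DataStage, or a script.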
III.Client Name : Telecom Fiji Ltd., Fiji (TechMahindra Ltd.)
Duration : May’08 - Oct’10
Product : NetExpert 6.2 (FMS, PMS, DMP)
Role : Lead Software Developer
Project includes full E2E BSS/OSS system implementation.
Delivered OSS southbound integration, including the NetExpert SA suite (Fault, Performance & SLA modules), and northbound integration with Remedy.
Handled Go Live activities. Co-ordinate onsite and offshore activities.
Handled the system administration part of NetExpert, including installation, configuration, and integration of the different modules (FMS, PMS, and DMP) in the SIT, UAT, and production environments.
Executed SIT & UAT plans onsite.
Conducted NetExpert application and SOP Training for Client in Fiji.
Involved in preparation of test cases, training documents and final handover documents to be delivered to Client.
Developed southbound interfaces in NetExpert with various EMSs such as Marconi, Provision, Alcatel 1353, AXE switches, and NGN for FMS and PMS.
Developed cluster monitoring and startup scripts catering to failover scenarios.
Designed and developed alarm correlation and escalation policies using DMP.
Designed and developed fault & performance reports in Crystal Reports.
Designed and developed heartbeat monitoring functionality.
Performed requirement gathering, BRS preparation, and integration with NX for the NGN implementation.
IV.Client Name : Globe Telecom, Philippines (Techmahindra Ltd.)
Duration : Sep’07 - Apr’08
Product : Clarity, Service Assurance
Role : Software Developer
First phase of project involved complete End to End implementation of Clarity service assurance suite for 3G wireless services which include SLA, Alarm, Fault, Network Diagrammer modules.
Involved in requirement gathering and analysis phase.
Solution design, implementation, configuration and testing of SLA, FAULT, Alarm, Network Diagrammer modules.
Designed & developed COGNOS reports.
Prepared UAT Test cases.
V.Client Name : PIPEX Wireless, UK (TechMahindra Ltd.)
Duration : Jun’07 - Aug’07
Product : MAXIMO, AXIOSS
Role : Software Developer
MAXIMO is an IBM BSS product that was integrated with AXIOSS, a web portal, etc., and used to manage WFM, TT management, SLA, and preventive maintenance.
Customized MAXIMO product modules for WFM, TT, and SLA Manager.
Designed and developed database tables, stored procedures, and triggers.
Prepared UAT Test cases.
VI.Client Name : Sri Lanka Telecom (TechMahindra Ltd.)
Duration : Aug’06 - May’07
Project : Customer Web Portal Development
Role : Software Developer
The Customer Web Portal is a single interface for users to access the Fault, Performance, and SLA reports of the OSS.
Prepared the Functional Specification Document.
Developed the portal and resolved user queries on functional issues.
Preparation of Test cases and Test Data.
VII.Internal POC on TIBCO BW and webMethods with Clarity (OSS)
Designed system integration for Clarity using TIBCO BW and webMethods.
Documented all APIs and their parameter details, including the STATE diagram of the NMS.
Designed and tested the integration of Clarity (OSS) using TIBCO BW and webMethods.
2.Company Name : Reliance Communication, NNOC, India.
Duration : Jan’05 – Jul’06
Role : System Analyst & Team Leader
Team : Enterprise Broadband Support System
Project : Order Management system.
OMS in IBS is the core layer, which uses the TIBCO BPM engine set along with adapters to integrate the end applications used in the solution implementation.
Analyzed/monitored order uploading (including customer, location, and service creation) and MACD operations (including order creation in the OSS and updating all BSS-layer applications after order closure), and maintained SLAs for the same.
Performed error analysis/handling in end-to-end order uploading and MACD operations by checking the DB and the different system logs.
Deployed new releases/patches in production and performed end-to-end UAT in the production environment as per the SRS.
Analyzed OMS-layer operations and server CPU & memory utilization to plan enhancement of system resources and server scaling for business requirements and smooth operation.
Raised production defects and change requests as required in eBSS.
Defined rule bases in TIBCO Hawk for error detection and monitoring of TIBCO operations.
Monitored OMS interfaces with ADC, CRM, eNIMS, eCMDE, Payment Engine (BSS), and Clarity (OSS).
Ensured data consistency across all applications in the BSS and OSS layers and performed the data reconciliation process.
Generated requirements to automate operations/processes at the OMS layer.
Involved in BPM engine and server split activities with DBAs and developers.
Maintained order XMLs, logs, the OMS portal, and the InConcert server; performed database analysis, data purging activities, and tuning of heavily loaded queries with help from DBAs and developers.
Updated SOPs, trained team members, and generated reports.
Generated testing requirements in ST.
Our main goal was to make the system zero-defect, zero-delay, and zero-touch.
3.Company Name : Nascent Compute Solution India Pvt. Ltd.
Duration : July’04 - Jan’05
Role : Sr. Software Programmer
Project Name : COMBOS, COMNET
I.Worked on COMBOS software (a back-office system) used by share brokers and sub-brokers after transactions are completed on the stock exchange.
Designed & developed modules using CTD & SQL Server 2000.
Developed reports using Centura Report Builder.
Developed SQL procedures supporting the application and report generation.
Developed modules to import and export files between the stock exchange and SQL Server 2000.
Wrote a module in Visual Basic to upload files from MS Excel to the database.
Updated existing modules as per requirements.
II.Worked on COMNET, an online transaction application (www.technogroup.com).
Data entry, reporting, and billing can be done by any branch or client.
Developed the website using ASP and scripting languages.
Developed SQL procedures for online report generation.
4.Project Name : Insurance Management Software(Intern 2002)
Language : Fox-Pro (2.6)
This is single-user software; it manages all insurance policy entries and client premium entries.
Report generation is designed for monthly activity, premium details, pending payments, etc.
Additional Information:
Awarded Best Team Player for the OSS Fiji project implementation at Tech Mahindra.
Received a letter of appreciation from the OSS PIPEX project at Tech Mahindra.
Attended certified training on NetExpert Development, system administration at Tech Mahindra.
Attended certified training on IBM Tivoli at Tech Mahindra.