SHEKH FIROZ ALAM
Senior Data Engineer
Email: adhj91@r.postjobfree.com | Skype: shekh.firoz.alam
OBJECTIVE
Seeking a challenging role with an expanding and dynamic organization where I can apply the skills gained through my past experience and enhance my knowledge through dedication and hard work.
SUMMARY
Senior Data Engineer with 8+ years of IT experience focused mainly on data pipelines, data ingestion, extensive data analysis, design, development, testing, and implementation of Big Data analytics solutions using Python, across a wide range of industries such as telecom, food ordering, and security systems.
Strong hands-on experience in Python, Hive, Spark SQL, and Spark Streaming.
Expertise in Big Data Cloudera (CDH) administration and cluster implementation (1,200 nodes).
Adding and removing nodes, tracking jobs, monitoring critical parts of the cluster, configuring NameNode high availability, and other Hadoop management functions.
Hadoop cluster design, performance tuning, and optimization.
Extensive experience in ETL using open-source tools (NiFi, Sqoop, ODI).
Good experience in cloud computing, namely GCP, Azure (Data Factory, Blob Storage, Synapse), and Databricks.
Expertise in extracting data from disparate sources and file formats (.xls, .csv, .txt) and databases such as Oracle, SQL Server, MySQL, and NoSQL databases (MongoDB, HBase, Cosmos DB).
Built data pipelines using various tools and technologies and wrote the results to HDFS.
A good team player with excellent interpersonal and communication skills for interacting with business analysts, project managers, developers, customers, and stakeholders.
Experience in machine learning techniques such as Logistic Regression, Random Forest, Decision Trees, Support Vector Machines, and clustering.
Hands-on with Hadoop, Hive, Spark, Sentry, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, and ZooKeeper.
Overview of the data visualization tool Arcadia.
Good experience in data collection, querying, modification, analysis, coding, summarizing findings, and presenting results.
CERTIFICATION
IKM (International Knowledge Measurement) – Python with MySQL
EDUCATION QUALIFICATION
Bachelor’s degree (B.Tech) in Information Technology, New Delhi.
12th from Doon Public School, CBSE New Delhi.
10th from Air Force School, CBSE, Uttar Pradesh.
TECHNICAL SKILLS
Data Lake : Snowflake, Blob Storage, Databricks
Big Data : Cloudera, Hadoop, Spark SQL, Spark Streaming, Linux, Shell Scripting, HDFS, Hive
MQ : Kafka, Flume
ELK : Elasticsearch, Kibana, Logstash
ML (Algorithms) : Supervised, Unsupervised, and Reinforcement Learning
Working Skills : Python, Hive, Spark (SQL, Core, ML)
Prog Languages : Python, Java (Spring Boot), Scala, SQL
Databases : MySQL, Postgres, SQL Server, Oracle 10g, NoSQL (MongoDB, HBase)
IDEs : STS, Eclipse, PyCharm, Anaconda
OS : Macintosh, Linux, RedHat 6.9, Windows
Cloud : GCP, Azure (Blob), AWS (EC2, Redshift, S3, EMR)
Visualization Tools : Arcadia (open source), Tableau, MSBI, QlikSense
Tools : GitHub, Jira, Maven
PROFESSIONAL EXPERIENCE
Experience in UK (LONDON):
Working full time through Upwork; please find the Upwork hiring profile below:
https://www.upwork.com/fl/shekhfiroz
Experience in KSA(Riyadh):
COMPANY: Innovative Software Solutions (Jan 2019 to date)
SUMMARY: Innovative Solutions is a well-established and leading cybersecurity services company in the GCC region, headquartered in Riyadh with regional offices in Jeddah, Al Khobar, Abu Dhabi, and Dubai. It is a trusted partner providing world-class, tailored, and purposeful cybersecurity products, solutions, and services combined with strategic consulting that exceeds clients’ expectations.
Roles and Responsibilities:
I am working on the Cyber Watch product, which is responsible for cyber-security threat detection, enrichment, alerts, and alarming.
I am using Kafka, Apache Metron, and many other open-source technologies to achieve this.
We use the ELK stack to visualize data and to report to the security operations team if any fraud occurs.
My role is to develop the product end to end and bring it to production.
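A toy sketch of the enrichment-and-alert step described above; the field names and the threat-intel set are illustrative assumptions, not the real Cyber Watch schema:

```python
# Hypothetical enrichment/alerting sketch: THREAT_IPS and the event
# fields are invented for illustration only.
THREAT_IPS = {"203.0.113.9"}  # stand-in for a threat-intelligence feed

def enrich_and_alert(event):
    """Tag an event with a threat flag; return an alert dict on a match."""
    enriched = dict(event, is_threat=event.get("src_ip") in THREAT_IPS)
    if enriched["is_threat"]:
        return {"alert": "THREAT_MATCH", "src_ip": event["src_ip"]}
    return None
```

In a real pipeline this function would sit between the Kafka consumer and the Elasticsearch sink, with the enriched events indexed for Kibana dashboards.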
Experience in India:
COMPANY: Micro Spark Software Solutions (August 2017 to Dec 2018)
SUMMARY: Micro Spark Software Solutions is an innovative hub providing career consultation and solutions. With a continuing commitment to quality, the company’s ongoing objective is to improve its position as one of the leading providers of resources for IT, ITES, and non-IT businesses.
CLIENT I: VODAFONE GERMANY
SUMMARY: Vodafone is a leading telecom company around the globe; Vodafone Germany is one of the best service providers in Germany.
Project 1: Implementing Big Data and Data Analytics for Vodafone Germany
Role: Data Engineer
Organized and actively participated in workshops and meetings with stakeholders at various levels for gathering the business requirements.
Data preparation was done in Python, eliminating null and missing values.
Used pandas, NumPy, scikit-learn, matplotlib, and seaborn to analyze and calculate the sum of profits on a daily, monthly, and quarterly basis.
Analyzed and used advanced machine learning techniques to understand correlations between the variables (e.g., marketing spend), which helped the business achieve its goals and gain profits.
Used Linear Regression, Decision Trees, and Random Forests, comparing their accuracies.
Created various visualizations using matplotlib and seaborn as per the business requirements.
Presented the modeling results and data insights to the managers and stakeholders.
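A minimal sketch of the preparation and profit aggregation described above; the column names (order_date, profit) are assumptions, not the real Vodafone schema:

```python
# Hypothetical data-prep sketch: drop null/missing rows, then sum
# profit per day, month, and quarter with pandas.
import pandas as pd

def prepare_and_summarize(df):
    """Return (daily, monthly, quarterly) profit sums after cleaning."""
    clean = df.dropna(subset=["order_date", "profit"])
    clean = clean.assign(order_date=pd.to_datetime(clean["order_date"]))
    daily = clean.groupby(clean["order_date"].dt.date)["profit"].sum()
    monthly = clean.groupby(clean["order_date"].dt.to_period("M"))["profit"].sum()
    quarterly = clean.groupby(clean["order_date"].dt.to_period("Q"))["profit"].sum()
    return daily, monthly, quarterly
```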
Project 2: Elisa Telecom Finland (Implementation of Big Data Cluster)
SUMMARY: Elisa is a telecom operator in Finland with fiber and telecommunication services across the country.
Role: Data Engineer
Involved in the analysis, design, development, and implementation of a Big Data ecosystem.
Collaborated with the Technical Architect in choosing the tech stack.
Helped the team in formulating and implementing its technical vision, providing hands-on insight into system architecture, data software development, and data warehouse design.
Worked as a data modeler for logical and physical data models.
Determined the appropriateness of data storage, archive, purge, and backup strategies in HDFS.
Worked closely with the ETL and BI reporting teams to guide, educate, and tune queries.
After analysis and cleaning, the CDR files were persisted in HBase.
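A toy sketch of the CDR cleaning step described above; the field layout (caller, callee, duration) is an assumption, not the real Elisa CDR format:

```python
# Hypothetical CDR-cleaning sketch: parse CSV-style records and drop
# rows with missing or invalid fields before persisting downstream.
import csv
import io

def clean_cdrs(raw_text):
    """Return a list of valid CDR dicts from raw CSV text."""
    cleaned = []
    for row in csv.reader(io.StringIO(raw_text)):
        if len(row) != 3:
            continue  # malformed record
        caller, callee, duration = (field.strip() for field in row)
        if caller and callee and duration.isdigit():
            cleaned.append({"caller": caller, "callee": callee,
                            "duration_sec": int(duration)})
    return cleaned
```

In the actual pipeline the cleaned records would be written to HBase rather than returned in memory.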
COMPANY: AMDOCS Development Centre (Telecommunications) (Dec 2016 to August 2017)
Project 1: Three.ie (Implementation of Big Data and Data Analytics)
SUMMARY: Three is 100% owned by CK Hutchison Holdings Limited (CK Hutchison), a renowned multinational conglomerate committed to innovation and technology with businesses spanning the globe.
Role: Data Engineer
Set up pipelines from data sources using the Cloudera big data stack to store the data.
Analyzed CDR files using Spark and machine learning (Decision Tree, Random Forest).
Using Spark in the Cloudera distribution, analyzed the CDR files and suggested mobile top-up plans to the business at Three.ie (Three Ireland).
My role is to write Shell/Python scripts and automate all the billing jobs in AMC (a job-monitoring tool); at regular intervals we update the AMC tool using the Python/Django framework for a smooth workflow of jobs.
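A toy sketch of the kind of job-monitoring automation described above; the job names, status values, and retry policy are illustrative assumptions, not the real AMC interface:

```python
# Hypothetical AMC-style monitoring helper: scan job statuses and
# pick out failed jobs that are still eligible for resubmission.
def find_stuck_jobs(job_statuses, max_retries=3):
    """Return sorted names of FAILED jobs below the retry limit."""
    to_retry = []
    for name, info in job_statuses.items():
        if info["status"] == "FAILED" and info["retries"] < max_retries:
            to_retry.append(name)
    return sorted(to_retry)
```

A cron-driven script built around a helper like this could resubmit the returned jobs and post the outcome back to the monitoring tool.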
Project 2: British Telecom (Implementation of Big Data and Data Analytics)
SUMMARY: British Telecom (BT) is a leading telecommunications provider in the United Kingdom.
Role: Data Engineer
In British Telecom (Everything Everywhere, Ensemble, ABP, AMC), I analyzed the CDR files using Big Data (Spark and Hive) and machine learning (SVM, Random Forest, clustering).
Most of the time we go with clustering and Random Forest algorithms to get insights.
My role is to keep the Amdocs Monitor Control (AMC) tool up and running; AMC is the heart of billing in production.
Besides Big Data, I also work on automation.
COMPANY: APPSTER Information Technology (March 2015 to May 2016)
SUMMARY: Appster works across the entire lifecycle of taking an idea to market, supporting entrepreneurs and challenger brands with ideation, validation, product strategy, engineering, ongoing maintenance, and growth.
Project 1: 2STONES
Role: Software Engineer
Developed RESTful web services and a CMS admin panel using Spring 3.0, JPA, and other technologies and frameworks as per client requirements.
2Stones is an e-commerce mobile application selling products across different states in Australia.
My role is to develop RESTful services through which customers can purchase articles, return them, or request refunds.
Built the admin panel using the Spring MVC framework; admins verify sellers’ products before listing them in the application.
Project 2: SIMPLY SPORTS
Simply Sports is an application that notifies users about the sports they follow; when an admin adds an RSS feed from the admin panel, all users are notified.
My role is to develop RESTful services (Jersey) and the admin panel, using the MVC design pattern.
Deployed the WAR in the WebLogic application server.
COMPANY: Kundoor Consultancy Services (Jan 2014 to Feb 2015)
Project: DOCTOR CONNECT
SUMMARY: Doctor Connect is a mobile and web application through which anyone can call or video-call a doctor and consult about a problem. The doctor prescribes medicine, which is delivered to the patient’s house; everything was automated.
Role: Software Trainee
My role was to develop the backend, i.e., Spring RESTful services, across three tiers:
Super Admin, App Admin, and mobile users (e.g., doctor sign-up, uploading doctor qualifications).
Admin verifies the license to practice and approves the doctor’s profile.
We used the WebLogic application server to deploy to production.
My role was end to end: development, debugging, and deployment.
PERSONAL PROFILE
DOB : 02-02-1990
Languages known : Proficient in English; understands Arabic
Nationality : Indian
Iqama Status : Transferable
Religion : Islam