Syed Hafeez Syed Zaker
Email: ada6wp@r.postjobfree.com
Contact: +919*********
Address: Flat No 502 Prime Enclave, Near Sparkling Beauty - Paramount Colony, 8-1-398/PM/330, Paramount Hills, IAS Colony, Toli Chowki, Hyderabad, Telangana-500008.
Objective:
To work in an organization where I can apply my technical expertise while continuing to learn new technologies.
Professional summary:
Passionate about finding practical solutions to complex data problems with high-quality delivery.
Over five years of professional IT experience as a Big Data Engineer in customer-centric organizations, with domain knowledge of big data analytics and machine learning.
Installing, configuring, monitoring and maintaining Apache Hadoop clusters in the cloud using various distributions including Hortonworks and Cloudera.
Experience working with Hadoop ecosystem components: HDFS, Hive, Pig, Sqoop, Flume, ZooKeeper, YARN and Oozie.
Experience working on Hadoop clusters, including cluster planning, designing, implementing, benchmarking, performance tuning and monitoring.
Hands-on experience in Linux administration: user management, storage management, network management and shell scripting.
Strong knowledge of Hadoop HDFS architecture and the MapReduce framework.
Experience in working with large volumes of streaming data using Flume and Kafka.
Tuning Spark and YARN clusters to make maximum use of available resources.
Advanced hands-on experience troubleshooting operational issues and identifying root causes in Hadoop clusters.
Exposure to Linux/Unix shell scripting for process automation.
Experience working in cloud environments such as AWS and GCP.
Working experience with DevOps practices, including continuous integration and delivery using Jenkins, Bamboo and CodeDeploy.
Experience presenting project/product demos to clients and end users, with creative presentations and documentation.
A Big Data enthusiast; have delivered multiple sessions on Hadoop technologies at both corporate and academic levels, and passionate about learning new technologies.
Excellent documentation skills, including designing UML diagrams.
Good experience with Agile methodologies and Scrum.
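As an illustration of the Linux administration and shell-scripting work above, a minimal storage-monitoring sketch; the 80% threshold and the mount point are illustrative choices, not values from any specific environment:

```shell
#!/bin/sh
# Sketch of a storage-monitoring script of the kind used for routine
# Linux administration; threshold and mount point are illustrative.

THRESHOLD=80

check_mount() {
  mount_point=$1
  # df -P gives POSIX-format output; awk reads the use% column
  # of the second line and strips the trailing '%'.
  usage=$(df -P "$mount_point" | awk 'NR==2 {gsub("%",""); print $5}')
  if [ "$usage" -ge "$THRESHOLD" ]; then
    echo "WARN: $mount_point at ${usage}% (threshold ${THRESHOLD}%)"
  else
    echo "OK: $mount_point at ${usage}%"
  fi
}

check_mount /
```

In practice a script like this would run from cron and mail or page on the WARN branch rather than just printing.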
Work Experience:
Organization : Infosys Private Limited
Designation : Senior Associate Consultant – Big Data Hadoop
Duration : November 2019 to date.
Job Role and Responsibilities:
Develop a strategy the team will use to reach its goal
Provide any training that team members need
Communicate clear instructions to team members
Listen to team members' feedback
Monitor team members' participation to ensure the training they are being provided is being put into use, and also to see if any additional training is needed
Manage the flow of day-to-day operations
Create reports to update the company on the team's progress
Distribute reports to the appropriate personnel
Organization : GetInCity IT Services Private Limited
Designation : Hadoop Administrator
Duration : From August 2014 to November 2019.
Job Role and Responsibilities:
Processing large data sets, assisting in hardware architecture.
Capable of planning and estimating cluster capacity and creating roadmaps for Hadoop cluster deployment.
Setting up, installing, configuring, maintaining and monitoring HDFS, Yarn, Flume, Sqoop, Pig, Hive, Oozie.
Performing cluster tuning, cluster monitoring and troubleshooting.
Troubleshooting and debugging Hadoop ecosystem runtime issues.
Experience in commissioning and decommissioning nodes, trash configuration and the HDFS balancer.
Evaluating Hadoop infrastructure requirements and designing/deploying solutions (high availability, big data clusters, etc.).
Performing backups, snapshots and recovery from node failures.
Installing various components and daemons of the Hadoop ecosystem.
Importing/exporting data to/from RDBMS.
Loading data from S3 to HDFS and vice versa, and assisting in the creation of data pipelines.
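A sketch of the kind of commands behind the RDBMS import/export and S3-to-HDFS loading work above; the hostname, database, table and bucket names are placeholders, and both commands assume Sqoop and the hadoop CLI on a live cluster node:

```shell
#!/bin/sh
# Illustrative sketch only: db-host, sales, orders and example-bucket
# are placeholder names, not real endpoints.

import_from_rdbms() {
  # Sqoop pulls the "orders" table from MySQL into HDFS as files
  # under --target-dir, splitting the work across 4 parallel mappers.
  sqoop import \
    --connect jdbc:mysql://db-host:3306/sales \
    --username etl_user -P \
    --table orders \
    --target-dir /data/raw/orders \
    --num-mappers 4
}

copy_s3_to_hdfs() {
  # DistCp launches a MapReduce job that copies S3 objects in parallel.
  hadoop distcp s3a://example-bucket/landing/ hdfs:///data/raw/landing/
}

# Only invoke the import when the tool is actually on PATH,
# since both commands need a running cluster.
if command -v sqoop >/dev/null 2>&1; then
  import_from_rdbms
else
  echo "sqoop not on PATH; skipping import"
fi
```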
Projects:
Digital Marketing (Ad-Tech):
-The project was an on-premise/in-house cluster.
-It had 1,330 nodes running 20 services.
-It was an Apache Hadoop cluster.
-The core services were HDFS, YARN, HIVE, HUE, SQOOP, FLUME, ZOOKEEPER, OOZIE, KAFKA, SPARK, STORM.
Health-Care:
-It was a cloud-based cluster, created on the AWS cloud.
-It had 30 nodes running 15 services.
-It was a Cloudera Hadoop cluster.
-The core services were HDFS, YARN, HIVE, HUE, SQOOP, FLUME, ZOOKEEPER, OOZIE, KAFKA, STREAMSETS, LDAP, SENTRY, MIT-KERBEROS.
Logistics:
-It was a cloud-based cluster, created on the AWS cloud.
-It had 20 nodes running 10 services.
-It was a Hortonworks Hadoop cluster.
-The core services were HDFS, YARN, HIVE, SQOOP, FLUME, ZOOKEEPER, OOZIE, KAFKA, RANGER, AD-KERBEROS.
Educational Qualification:
M.E. (CSE), Everest Engineering College - 2018
B.E. (CSE), P.E.S College of Engineering - 2014
Diploma (CSE), M.I.T College of Engineering - 2011
S.S.C, Maharashtra State Board - 2007
Personal:
D.O.B. – 2 June 1992
Marital Status – Unmarried
City – Aurangabad
Passport ID – M5231120