Post Job Free


Data Customer Service

Location:
Texas
Posted:
March 15, 2017

Contact this candidate

Resume:

SAAD SIDDIQUE

Phone: +1-409-***-**** E-mail: aczbbj@r.postjobfree.com LinkedIn: linkedin.com/in/saad-siddique-90517430

Address: ***** ********** ***** ********* *** Houston, Texas 77069

PROFESSIONAL SUMMARY

Certified Hortonworks Hadoop Associate with one year of experience in Hadoop development and administration.

Three years of experience in the telecom industry, focused on data analytics.

Strong knowledge of and hands-on experience with the Hadoop and Big Data ecosystem, including Ambari, MapReduce, YARN, HDFS, Hive, Spark, Zookeeper, Sqoop, Falcon, Oozie, Ranger, and Solr.

Four years of experience in data and statistical analytics and database systems development, with exposure to database design, data mining, warehousing, and ETL.

Extensive knowledge and work experience in statistical modeling, predictive analysis, and data processing using Python and R.

Four years of experience in business intelligence and dashboard reporting with Excel, Tableau, Qliksense, and Zeppelin.

Experienced in interacting with business users, gathering requirements, and developing detailed functional specs through JAD sessions and business requirement documents across all project deliverables.

Extensive experience in project management best practices, processes, and methodologies.

Technical Skills

Development/OS

Python, R, JAVA, SQL, shell scripting, Red Hat Linux, CentOS, Ubuntu

Hortonworks Hadoop

Administration with Ambari; security with Ranger; data governance with Atlas, Waterline, Falcon, and Oozie; data access with Hive, Solr, and Spark; data management with HDFS.

Data Analytics & BI

MS Excel, Tableau, Qliksense, Apache Zeppelin, IBM Cognos, R, SSRS, Informatica

Concepts and Methodologies

Big data, Data management, Data cleaning, Relational databases, Data warehousing, Data analysis, ETL, Waterfall, Agile, Six Sigma, SOX, Project Management.

Education

2015-2016

Master's in Computer Science, Major: Data Science & Databases, GPA: 4.00

Lamar University, Beaumont, Texas 77701.

2008-2012

Bachelor's in Engineering, Major: Electrical (Telecommunication) Engineering, GPA: 3.13

National University of Sciences and Technology.

Work Experience

Hewlett Packard Enterprise, Houston TX October 2016 – Present

Big Data Engineer

Responsibilities:

Working on the big data support team to ensure that all Hortonworks Hadoop clusters run without issues.

Responsible for cluster administration, maintenance, monitoring, and troubleshooting.

Working with the data science team on predictive analytics for performance metrics of big data platforms.

Worked with engineering team to plan and deploy new Hadoop environments and expand existing Hadoop clusters.

Upgraded HDP and Ambari on multiple clusters and installed HDP services on the clusters.

Wrote shell scripts to create automated cron jobs for efficient cluster administration.

Worked with data ingestion teams for data migration from EDW to Hadoop using Falcon, Oozie and Sqoop.

Analyzed system failures, identified root causes, and recommended courses of action. Documented system processes and procedures for future reference.

Accomplishments:

Presented weekly training sessions on Hortonworks Hadoop and other big data services to my team.

Created Qliksense and Apache Zeppelin dashboards for Oozie and Falcon data ingestion jobs for efficient monitoring.

Performed predictive analytics on performance metrics of big data platforms for the Center of Excellence team.

TransCanada, Houston TX May 2016 – October 2016

Intern

Responsibilities:

Developed a cost planning, forecasting, and tracking tool for the IT department at TransCanada.

Developed and automated multiple departmental reports using Tableau and MS Excel.

Extracted data from IBM Cognos to create automated visualization reports and dashboards on Tableau.

Executed business critical projects with analytical and operational decision making involving multiple teams.

Interacted with the Business Users for gathering design requirements and taking feedback on improvements.

Assisted team members in day-to-day activities and tasks related to data analysis and reporting.

Lamar University, Beaumont, TX August 2015 – May 2016

Graduate Assistant & SQL Developer

Worked as a teaching assistant for database design and data mining courses, and participated in an initiative by the School of Computer Science to create a central database of all current students' educational backgrounds, degrees, interests, and current and past professional positions. The vision of this project was to build the school's brand and encourage networking opportunities for graduating students.

Responsibilities:

Worked as Graduate teaching assistant for the courses of Database Design and Data mining.

Participated in requirement definition, target-state objectives, analysis, and design of the student database project, and finalized the design of the backend database structure, export structures, attributes, and access processes.

Created scripts for data imports from distributed data sources, and wrote complex SQL queries, tables, views, functions, and procedures in SQL Server based on the design and requirements.

Ufone-Etisalat (Telecom Carrier) October 2012 – December 2014

Executive Engineer

Ufone is one of the top telecom operators in the nation, serving 20 million customers nationwide. It is part of the Etisalat Group, one of the largest mobile operator groups in the world.

Responsibilities:

Worked in coordination with the marketing, marketing BI, strategy, sales, and technical operations teams.

Performed KPI (Key Performance Indicator) analysis for optimal network performance, and prepared dashboard reports on a daily, weekly, and monthly basis using multiple performance analysis tools and SQL databases.

Wrote complex SQL queries to pull customer data for statistical analysis and presented the results in interactive reports.

Performed forecasting and predictive analysis of network and user data usage based on user trends.

Created Excel VBA macros for automated reporting on network KPIs and customer data.

Deployed the Huawei Smart Care SAS solution: a network performance analysis, KPI analysis, and user management tool.

Designed the SQL database architecture for the Smart Care SAS solution after gathering input from multiple departments, and integrated multiple databases using SSIS and ETL principles to create tables and views.

Developed standard reports, dashboard reports, charts, and drill-through reports in the Smart Care SAS solution.

Integrated the customer service CRM with the Smart Care platform and SQL database to provide real-time customer details.

Accomplishments:

Automated all departmental reports using Excel VBA macros and SQL scripts, saving 1,000 man-hours per year.

Led the $1 million Smart Care SAS project and delivered it successfully, meeting all project goals.

Initiated 4 improvement projects that increased efficiency and saved man-hours.

Projects

Deployment of Qliksense and Zeppelin dashboards for the Hadoop data ingestion project at HPE (Ambari, MySQL, Qliksense hub, Apache Zeppelin):

As part of a data migration project from the enterprise data warehouse to a Hortonworks Hadoop data lake, created dashboard reports for Oozie and Falcon jobs in Qliksense hub and Apache Zeppelin notebooks.

Predictive Analytics on performance metrics for Big data platforms at HPE (Python, R, Apache Spark, Zeppelin, Qlik):

Wrote Python and R code for predictive analysis of big data platform performance metrics using ARIMA modeling on Apache Zeppelin and Apache Spark. The project focused on forecasting future performance metrics to reveal trends and enable corrective action.
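The forecasting idea above can be sketched in miniature. The snippet below fits a plain AR(1) model by ordinary least squares and iterates it forward; this is a simplified stand-in for full ARIMA, and the CPU-utilization series is an invented example, not actual project data:

```python
# Minimal AR(1) sketch: y[t] = c + phi * y[t-1] + noise.
# A simplified stand-in for ARIMA modeling; the metric values are invented.

def fit_ar1(series):
    """Fit y[t] = c + phi * y[t-1] by ordinary least squares."""
    x = series[:-1]          # lagged values
    y = series[1:]           # current values
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast(series, c, phi, steps):
    """Iterate the fitted recurrence forward from the last observation."""
    preds, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds

# Hypothetical daily CPU-utilization metric for one cluster node.
cpu = [62.0, 64.1, 63.5, 65.2, 66.0, 67.3, 66.8, 68.1]
c, phi = fit_ar1(cpu)
print(forecast(cpu, c, phi, 3))  # three-step-ahead trend
```

In practice a library such as statsmodels would handle differencing and the moving-average terms; the point here is only the shape of the fit-then-forecast workflow.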

Sentiment Analysis of Twitter users using the Hadoop ecosystem (Hadoop, NiFi, Solr, Hive, R, Excel, Tableau):

A tool for real-time sentiment analysis of Twitter users on specific topics. Integrated the Twitter API with the Hadoop platform using Apache NiFi dataflow tools (an ETL pipeline) and saved the tweets in HDFS. Preprocessed the data using R and Solr. Created a Hive database for the tweets and processed the data using HQL (a SQL-like language) on top of MapReduce. Developed algorithms to calculate the sentiment of each tweet. Performed multidimensional visualization and analysis of the results using tools such as Excel, PowerPoint, and Tableau.
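The per-tweet scoring step could be as simple as a lexicon lookup. This sketch illustrates the idea; the word lists and scoring rule are illustrative assumptions, not the original algorithm:

```python
# Toy lexicon-based sentiment scorer for tweets.
# The word lists and scoring rule are illustrative, not the project's algorithm.
import re

POSITIVE = {"great", "love", "good", "awesome", "happy"}
NEGATIVE = {"bad", "hate", "terrible", "awful", "sad"}

def sentiment(tweet):
    """Return (score, label), where score = #positive - #negative tokens."""
    tokens = re.findall(r"[a-z']+", tweet.lower())
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return score, label

print(sentiment("Love the new update, it works great!"))  # positive
print(sentiment("Terrible outage today, really bad."))    # negative
```

In the Hadoop pipeline the same logic would run as an HQL user-defined function or a MapReduce step over the tweets stored in Hive.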

Deployment of Smart Care SAS platform at Ufone Telecom (MS SQL Server, Java, SOAP, XML, SSIS, Sybase, CRM):

A web-based tool for network performance analysis, KPI analysis, user data management, and reporting services. Integrated the tool with existing databases using SSIS and ETL procedures. Developed standard reports, dashboard reports, charts, and drill-through reports, and created procedures for real-time queries to the CRM platform.

Inverted indexing of data records using Hadoop (Hadoop, HDFS, Solr, and Visual Studio):

Downloaded 2.5 million Amazon book records into HDFS. Created a schema for indexing in Apache Solr and indexed the records with redundancy and a replication factor. Created a web-based interface in Visual Studio to access the records with multiple search criteria.
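The inverted-index structure that Solr builds over documents can be illustrated with a toy in-memory version; the sample book records below are invented:

```python
# Toy inverted index: token -> set of record ids, the core structure
# a search engine like Solr builds over documents. Sample records are invented.
import re
from collections import defaultdict

def build_index(records):
    """Map each lowercase token to the ids of the records containing it."""
    index = defaultdict(set)
    for rec_id, text in records.items():
        for token in re.findall(r"\w+", text.lower()):
            index[token].add(rec_id)
    return index

def search(index, query):
    """AND-query: ids of records containing every query token."""
    tokens = re.findall(r"\w+", query.lower())
    if not tokens:
        return set()
    result = set(index.get(tokens[0], set()))
    for t in tokens[1:]:
        result &= index.get(t, set())
    return result

books = {
    1: "Hadoop The Definitive Guide",
    2: "Learning Spark",
    3: "Hadoop Operations",
}
idx = build_index(books)
print(search(idx, "hadoop"))        # {1, 3}
print(search(idx, "hadoop guide"))  # {1}
```

Solr adds tokenization rules, ranking, and distributed replication on top of this basic token-to-documents mapping.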


