AHMAD ABUADAS
SOFTWARE & DATA ENGINEER
Summary
Ambitious problem-solver with over a decade of experience in software engineering, including the last 7 years specializing in Data and Big Data Engineering. I hold a master's degree in Information Technology with a specialization in Big Data, academic grounding that pairs well with my hands-on experience in the field.
Education
• Master of Information Technology, Big Data
Walden University, Minneapolis, MN
2018 – 2020
• Bachelor of Electrical Engineering
Balqa Applied University, Jordan
2007 – 2012
Personal information
Name: Ahmad Jamal Abuadas
Work Auth: US Citizen
Phone: 850-***-****
Email: ad3pmi@r.postjobfree.com
Activities and Honors
o Created and optimized PySpark logic for complex data transformations, significantly improving efficiency and accuracy in reconciliation processes.
o Supported production for ETL/ELT jobs, Spark jobs, and database issues, debugging and fixing problems as they arose.
o Utilized automation tools such as Autosys, Jenkins, and RLM for scheduling and orchestrating data jobs, resulting in improved job reliability and minimized downtime.
o Successfully integrated Continuous Integration/Continuous Delivery (CI/CD) practices into the automation framework, ensuring seamless development, testing, and deployment of data solutions.
o Played a crucial role in ensuring the stability and scalability of reconciliation processes, allowing the business to adapt to evolving market conditions.
o Set up AWS EC2, S3, and IAM within the Kelvin platform testing process.
o Designed bridges to connect with devices for data transmission using Python scripts.
o Developed a POV UI integrated with the Kelvin platform using Python (Streamlit).
o Collaborated with the product team to create Kelvin Maps by setting up control system OPC configuration to extract data, connecting it to the OPC Bridge/MQTT Bridge for integration into the Kelvin platform.
o Developed EDHR and RMT real-time data-driven applications using PLC code, DeviceWise, and an Oracle database.
o Developed the YAS application, a real-time data-driven application, extracting data from the vision system using Python, implementing ML and text-mining algorithms for data analysis, and visualizing results using R.
o Developed an alarm and event tracking system, a real-time data-driven application extracting data from a SQL database using Python.
o Developed ETL pipelines loading data into MongoDB and Postgres databases as the backend for web applications.
o Developed a Spare Part Management System using Ruby on Rails and JS.
o Utilized PLC coding and HMI applications to store sensitive data, meeting business requirements.
o Utilized Databricks to run scheduled job flows for SQL queries and PySpark jobs, using Databricks ingestion tooling to connect with S3 and automated triggers for job flows.
o Utilized Hadoop, Spark, and Scala to implement robust data processing solutions, enhancing system performance and scalability.
o Created ETL jobs to load and manage data from logs, systems, and ODBC sources into MongoDB and Postgres databases.
Skills
Hadoop Components / Big Data: HDFS, Hue, MapReduce, Pig, Hive, HBase, Sqoop, Impala, Zookeeper, Kafka, YARN, Cloudera Manager, PySpark, Airflow, Snowflake, Scala
Languages: Python (TensorFlow, PyTorch, Keras, NumPy, SciPy, NLTK, Gensim, spaCy, Pandas, Matplotlib, Plotly), Linux shell scripts, R, Ruby, JS, C#, Java, SQL, Hive, Spark (SQL, Streaming, ML/AI)
IDE Tools: Visual Studio Code, JupyterLab, RStudio
Cloud Platform: AWS (Lambda, Redshift, S3, EC2, EMR, RDS, AWS Glue, Kinesis, IAM), Databricks
Reporting Tools: Tableau, RShiny, Streamlit, SSRS, Excel
Databases: MS SQL Server, MySQL, Postgres, Oracle; NoSQL (MongoDB, HBase, Cassandra)
Data Analysis: Pandas, NumPy, SciPy, Scikit-learn, RShiny, Streamlit, Plotly, Matplotlib
CI/CD Tools: Jenkins, RLM, Bitbucket, Autosys, Airflow
Software Methodologies: Agile, Scrum, Waterfall
Big Data Ecosystem: MapReduce, HDFS, HBase, Spark, Scala, Zookeeper, Hive, Pig, Sqoop, Cassandra, Oozie, MongoDB, Flume, Databricks
ETL Tools: Alteryx, DeviceWise, Talend, Airflow, DBT
Visual Diagram Tools: Visio, Miro, Notion
Other: Production support, bug fixing, checking logs, tracking tickets
Experience
BIG DATA ENGINEER
Citigroup via Doran Jones, FL, Sep 2022 – Current
- Leveraged cutting-edge Big Data tools and technologies, including Hadoop Distributed File System (HDFS), Apache Spark, Apache Hive, and Hue, to process and analyze large-scale financial data efficiently.
- Collaborated with cross-functional teams to understand project requirements and formulated data engineering strategies to meet business objectives.
- Created and optimized PySpark logic to perform complex data transformations and calculations, resulting in improved efficiency and accuracy in reconciliation processes.
- Employed SQL for data querying and manipulation, ensuring data quality and adherence to regulatory standards.
- Utilized automation tools such as Autosys, Jenkins, and RLM for scheduling and orchestrating data jobs, improving job reliability and minimizing downtime.
- Developed shell scripts to automate routine tasks, enhance system performance, and streamline data processes.
- Integrated Continuous Integration/Continuous Delivery (CI/CD) practices into the automation framework, ensuring seamless development, testing, and deployment of data solutions.
- Successfully delivered high-performance data solutions, reducing reconciliation discrepancies and enhancing overall data quality.
- Played a crucial role in ensuring the stability and scalability of the reconciliation processes, allowing the business to adapt to evolving market conditions.
OT/IT ENGINEER
Kelvin Inc., CA, 2022 (6 months)
- Set up AWS EC2 within the Kelvin platform.
- Designed bridges to connect with devices sending data to the Kelvin platform using Python.
- Discovered use cases and developed a POV UI integrated with the Kelvin platform using Python (Streamlit).
- Worked with the product team to develop Kelvin Maps.
- Set up control system OPC configuration to extract data and connected it to the OPC Bridge/MQTT Bridge for use in the Kelvin platform.
SOFTWARE DATA ENGINEER
Johnson & Johnson Vision Care via TATA Group, FL, 2019 – 2022
- Developed data-driven applications using Python, SQL query writing, and RStudio to extract, process, and visualize data.
- Implemented solutions and performed design for real-time data software such as EDHR (Electronic Device History Record), RMT (Raw Material Tracking), and YAS (Yield Analysis System) by setting up data transaction triggers using PLC, DeviceWise, and an Oracle database.
- Designed a fully managed ETL process using Python, R, DeviceWise, and PLC code.
- Worked across multiple environments (Dev, QA, UAT, Prod, etc.) to generate and check log files and set up monitoring tools.
- Designed the ETL process using SQL on Snowflake, Python, DeviceWise, and Alteryx.
- Developed automation Rockwell PLC software for the 5th generation of contact lenses, including SCADA HMI (Human-Machine Interface) software.
- Set up enterprise network architecture, dataflow, and data layer drawings (using Visio).
SENIOR ENGINEER
Fibertex Non-Woven, IL, 2017 – 2019
- Developed a Spare Part Management system using a SQL database, Ruby on Rails, and JS.
- Maintained and troubleshot control systems, including PLC controllers (AB, Siemens) and PLC components, Simatic Step 7, Totally Integrated Automation (TIA), and RSLogix 5000.
- Hands-on experience with Rockwell Automation monitoring software such as Studio 5000, FactoryTalk, HMI client, Diagnostics Viewer, FactoryTalk Admin Console, and OPC server.
- Worked on the safety system by implementing IQ for safety PLCs and designing an HMI UI showing safety element status, improving the safety of the control systems.