Technical Exposure
●Cloud Services
GCP Cloud Storage, GCP AutoML/Natural Language, GCP BigQuery, GCP Cloud Functions, GCP Pub/Sub, AWS EC2, S3, Glue, DMS, RDS, IAM, CloudFormation, CloudWatch
●Tools
Databricks, Tableau, Jupyter, Terraform, Apache Kafka, Docker, Kubernetes, Jenkins, Microsoft Dynamics CRM, SSRS, Power BI, Salesforce Administration, Agile, Confluence, Jira, Microsoft Office
●Languages
Python, Java, SQL, HTML, JavaScript, XML, JSON, C, Assembly, Prolog
Education/Certifications
●Bachelor's in Computer Science, George Mason University (2014-2019)
●Certified GCP Data Engineer
●Certified AWS Cloud Practitioner
Experience
Aptive Resources - (Client: US Customs & Border Protection) June 2024 - Present
●Helped create candidate ETL solutions using tools such as Databricks, Snowflake, Redshift, Apache Spark, Python, and SQL
●Presented efficient, budget-friendly, user-friendly, and FedRAMP-approved options to stakeholders
●Advised on tools and technologies for streaming and batch processing of data from multiple social media sites and other internet outlets associated with CBP
Apogee Integration - (Client: Department of Homeland Security) January 2023 - June 2024
●Created an Apache Kafka and Tableau Server API solution to sync user information and data between an internal tool and Tableau
●Used Jenkins and Docker to build an automated, containerized version of the Tableau user and data sync solution
●Helped create an automated solution to improve Tableau Server performance using Jenkins, the Tableau API, Python, and shell scripts
●Created Jupyter notebooks to accelerate testing of code that analyzes data received from the border
●Implemented Tableau personal access tokens in pre-existing solutions to improve security
RestonLogic - Cloud Engineer (Client: US Customs & Border Protection) September 2020 - October 2022
●Created and evaluated end-to-end solutions using GCP ML technology to extract data from 10M+ documents
●Used Databricks to build ETL processes for 4M+ data points streamed daily from the US borders
●Used AWS DMS and AWS CloudWatch to migrate databases to AWS and continuously monitor the streamed data
●Provided clean, transformed data through Databricks for reports prepared for the White House
●Created new jobs and clusters in Databricks to automate ETL processes and manage 2,500 databases
●Created policies and clusters in Databricks to grant users appropriate permissions
●Used Google Cloud to create a document classification and extraction proof of concept (POC) for CBP
●Created AutoML/Natural Language classification and extraction datasets and models
●Created Cloud Functions that use the Natural Language models to classify and extract data from documents
●Created BigQuery tables to sort and manipulate data extracted from documents
●Used GCP Pub/Sub to automate the classification and extraction process
●Used GCP Cloud Storage to store and organize documents
●Used Terraform to deploy Cloud Functions, create buckets, and enable/disable AutoML models
●Used AWS RDS and an internally developed tool to mask, synthesize, and store databases
NTT Data - Software Developer (Client: Department of Justice) June 2019 - September 2020
●Used Microsoft Dynamics CRM to create applications for the DOJ
●Helped create a complaint system for organizing and handling complaints submitted against judges; worked with business process flows and workflows, and created entities, forms, customizations, and reports
●Led meetings and discussions with clients and presented application updates
●Created SSRS reports using Visual Studio and XML for Dynamics CRM applications
●Created Power BI dashboards and reports
Skills and Activities
●Captain / President / Lead Choreographer for Mason Ke Rang, a nationally competing fusion dance team (2014 - 2018)
●Board Member - Tu Bhi Nachle, an Indian classical-fusion dance competition (2016 - 2018)
Clearance
Currently hold a DHS CBP Background Investigation (BI) clearance