Summary:
Over ** years of technical experience in IT across commercial and Government industries. Specializes in the application and integration of software development and web service projects.
Technical Skills:
Dev. Languages: Java, Python, NodeJS
Dev. Methodologies: Agile, Kanban
CI/CD Platforms: Git, GitLab, Bitbucket, Jenkins
Testing Frameworks: SonarQube, Selenium, Mockito, Whitebox, JMockit, JUnit
Cloud Technologies: AWS (DynamoDB, Lambda, ElastiCache, S3, ECS), Docker
Education/Training:
Johns Hopkins University, MS Computer Science 07/94
UMBC, BS Computer Science 08/91
Certifications:
AWS Certified Solutions Architect – Associate (SAA-C03) 12/2022
AWS Certified Developer – Associate (DVA-C01) 12/2022
CompTIA Security+ (Certified CE) – 11/2022
AWS Certified Database - Specialty 08/2021
Professional Experience Summary:
Leidos 08/19 – Present Lead SW Engineer
Working in the Division of Database Systems on project development and application support for various NoSQL databases and applications. Responsibilities include the following:
·Write Ansible scripts to deploy MongoDB, RDS, and ElastiCache updates, as well as Linux infrastructure updates.
·Work in an Agile environment where JIRA is used for bug tracking and Confluence for documentation.
·Addressed various Security Technical Implementation Guide (STIG) items that showed security flaws in deployed infrastructure.
·Wrote shell scripts to perform point-in-time recovery for MongoDB Community Edition.
·Automated SonarQube security scans on various software components. Set up a local SonarQube instance and automated the scanning and reporting. Established quality gates to ensure software met security guidelines.
·Automated Checkmarx security scans using Git, shell, and Java/Python. The automation pulls down the code with Git, builds it with Gradle, and then uses REST to submit the built library to Checkmarx. Base scanning looks for the Log4j vulnerability in both the developed software and dependent libraries. The final execution environment is the AWS CodeBuild CI/CD pipeline.
·Led the AWS DynamoDB proof of concept that showed how to replace various MongoDB implementations. The proof of concept used DynamoDB, Lambda, S3, and Glue. Developed AWS Glue data pipelines to populate data in DynamoDB; Lambda functions in the pipelines moved data from S3 to DynamoDB.
Vision Technology Services/Leidos 02/18 – 08/19 Lead SW Engineer
Worked on the Electronic Records Management Architecture project. Responsibilities include the following:
·Performed Java development in a Kanban-style Agile process with JIRA for bug tracking, Confluence for documentation, and Jenkins CI/CD for automated build deployments.
·Wrote a Jenkins pipeline for legacy Java systems. The pipeline included stages for compilation, unit tests, and static code analysis using SonarQube. Addressed Java issues identified by SonarQube.
·Wrote Java unit tests for legacy software systems written with the Struts, JSP, and Spring MVC frameworks.
·Wrote CI/CD Jenkins pipelines in Node.js to pull images from an Artifactory server, tag and push the images with Docker, and then upload them to a remote repository in AWS ECR.
·Developed Java unit tests to achieve 65%+ Java code coverage using PowerMockito, Whitebox, Mockito, JMockit, and JUnit. Tests were developed to document functionality and to discover and fix Java code bugs. The Jenkins CI/CD pipeline ran the tests in a dedicated stage and produced SonarQube coverage analysis.
·Used Git for revision control, developing on feature branches and issuing pull requests. Code is checked into Bitbucket, and the Jenkins pipeline runs the build and tests.
·Supported multiple MVC implementations including JSPs, Struts, EJB and Spring MVC.
·Tested SOAP projects using SOAP UI.
·Mentored junior developers on the technology stack.
Honore Consulting 08/17 – 02/18 Dev Ops Engineer
Working on remote contracts to help companies with AWS deployments. Responsibilities include the following:
·Automate and manage AWS infrastructure and deployment processes.
·Build AWS-based services supporting production SaaS platform including web applications and data analytic services
·Migrate existing on-premises services to an AWS cloud infrastructure.
·Develop cloud migration strategies and document procedures for AWS deployments
·Configure, support, and monitor Key Trustee Server with Apache Sentry within customers' offsite datacenter environments.
·Support AWS cloud infrastructure automation with multiple tools, including Ansible, Vagrant, Chef, and Docker, and monitoring tools such as Splunk, New Relic, and CloudWatch.
Amches 08/16 – 08/17 Lead Engineer
Assigned as a Lead Engineer on the SpringCloud project. The project analyzes satellite and wireless signals. Primary responsibilities included the following:
·Worked in an Agile environment using two-week sprints. Used T-shirt sizing to estimate work complexity.
·Lead engineer on the SpringCloud Admin (SC Admin) tool, which deploys SpringCloud to virtual hosts. SC Admin sets up data flows, hosts, and services. SC Admin uses GWT for the front end, Spring REST at the service endpoints, and a MySQL-based backend. The admin project comprises cluster automation and the building of SpringCloud networks and services. Automation covers both physical hardware and VMs in a clustered environment (Machine Shop, AWS, and OpenStack).
oCreated the schedule for SC Admin and directed two junior software engineers and a test engineer.
oInterfaced with customer on SC Admin requirements, issues and project schedule.
oServed as Scrum Master for the SC Admin sprints and led the sprint retrospective meetings with the customer.
oBackend database integration was accomplished with Spring JDBC Template.
·Developed SpringCloud system architecture for the AWS deployment. Architecture consists of ETL, Web/Presentation, and Data Tier (Mongo, Accumulo, MySQL, and S3). Presented migration to other virtual systems like vSphere and OpenStack.
·Developed a MySQL to MongoDB Java migration tool.
·Implemented MySQL backup and restore capabilities and a migration strategy from MySQL 5.1 to 5.7.
Cloudera 03/15 – 08/16 Senior Engineer
Assigned as Senior Engineer in Access and Integration area. Primary responsibilities include the following:
·Assisted commercial and government customers with ETL strategies.
·Worked with Oozie, Hive, Impala, Sqoop, Pig, Flume and Java MapReduce.
·Assisted in the creation of software patches for Oozie, Hive, Impala, Sqoop, and Pig.
OmnyOn 06/14 – 03/15 Senior ETL Engineer
Assigned as Senior Engineer in ETL and Web Service Analytics. Primary responsibilities include the following:
·Created ETL processes using Oozie workflows from publication documents to Impala/Hive tables in Parquet format using Pig and Hive.
·Lead developer on a frontend widget that displays subject-verb-object (SVO) tuples from publication documents. The front end is Ext JS and the backend is a Java REST service built on Spring and Jersey; the backend datastore is Impala.
oDirected three junior engineers on ETL strategy.
oWorked directly with the customer on defining requirements and success evaluation.
·Worked with Data Scientists to define the data model for various analytic dashboards.
oFacilitated the integration points between the data scientists and the database engineers to allow for additional data stores and schemas to support the dashboards.
·Created the backend data store to support logging requirements for the SVO widget in MySQL. Store supports CRUD operations for logged search events using Java.
·Developed operations dashboard with Cloudera HUE/SOLR and Pig/Hive as ingest technologies.
·Integrated the enterprise community landing zone (CLZ) and the government cloud. Rebuilt the zone with CDH 5 and deployed the new application in C2S.
oDirected two engineers on the CLZ project and reported to upper management on my team and the CLZ team.
·Created XSLT and XPath transforms to convert pubs.xml documents into Digital Reasoning format and an internal metadata XML format.
RedOwl Analytics 05/13 -- 06/14 Senior Engineer
Assigned as Senior Engineer in ETL and Behavioral Inference Analytics. Primary responsibilities include the following:
·Worked as the primary technical and business development lead on the In-Q-Tel (IQT) engagement. Wrote the system design specification and integration documentation for the IQT contract.
oDirected three software engineers and a test engineer on the delivery of the software and the specification to the government customer.
·Performed monthly performance appraisals for two engineers who reported to me.
·Led the reporting component of ETL by adding Java MapReduce programs to measure the number of records processed at each stage.
oTwo junior engineers reported to me on Sprint tasks on ETL strategies.
·Maintained and enhanced Hadoop MapReduce programs for data ingest (ETL). The process uses Spring Batch as the execution framework and runs in AWS with clusters spun up as needed. Data files for ingest are stored in S3 and copied to the AWS cluster.
·Wrote behavioral inference analytics to determine possible outcome statistics for events based on an event history. Current events and event history are input as JSON stored in the cloud. Output results are recorded in Elasticsearch.
NSP 03/12 -- 03/13 Lead SW Engineer
Assigned as Lead Software Engineer and Analytic Cloud Architect. Primary responsibilities include the following:
·Designed and implemented a data abstraction layer for the Jersey endpoints. The abstraction layer provides access to multiple cloud data storage technologies (CloudBase, Accumulo, HBase).
·Wrote a test suite to exercise the entire server side, including the cloud access layer. The suite simulates access to the cloud and mocks the server-side calls.
·Wrote a test client in HTML/JavaScript to read Google Protocol Buffers. The server side is a Spring Jersey controller that sends protobufs to the web client.
·Wrote Pig scripts to safely remove data from a client database and clean both the index and metadata tables.
·Created a platform for ingest of test data from the cloud. Data is stored in CloudBase and loaded into MySQL using Pig and Sqoop.
·Designed and implemented a solution to analyze metrics data from the cloud and store it in CloudBase. The metrics writer is written in Java/Hadoop and outputs SequenceFiles, which are read and analyzed by Pig using ElephantBird; the end data repository is CloudBase.
·Designed and implemented a utility to read Google Protocol Buffers from the cloud using Pig. Data is filtered and formatted for Sqoop to store in a MySQL database.
Dynamic Technology Group 12/10 – 03/12 Lead SW Architect
Assigned as a Lead Software Architect for the Utility Cloud Manager (UCM). The UCM is a web application that manages virtual machines and cloud environments.
·Created detailed design of the UCM as a Spring MVC application. Views are JSPs with data transfer being both classic Java beans and XML. View dynamics are integrated with JQuery and CSS.
·Worked as the project manager on the UCM after the design was created. Led all software activities including requirements definition with the government customer, sprint retrospective and scheduling.
oLed the efforts of the software team, which had five backend engineers, two frontend engineers, and two test engineers.
oInterfaced with both the hardware lead and external interfaces lead.
·Integrated the UCM with the VMware vCloud and SmartCloud provisioners.
·Designed and implemented provisioning layer for OpenStack and AWS.
·Architected the UCM interface to vCloud Director and VMware for creation, monitoring, and sustainment of clouds, networks, and virtual machines.
·Designed and developed the backend data store using MySQL and Hibernate as a data access technology. Used the Hibernate Template and annotations for table to object mappings.
·Designed and implemented a Spring RESTful web service to query a Perl process for VMware statistics.
·Designed and implemented a Spring RESTful Web Service to integrate with SmartCloud cloud management software.
(References available upon request)