Job Description
The successful candidate will leverage their development skills and experience to support the ingestion, cleansing, transformation, loading, and display of large volumes of data, with a particular focus on Cloud data.
Responsibilities
Designing and implementing large-scale ingest systems in a Big Data environment
Optimizing all stages of the data lifecycle, from initial planning and ingest through final display and beyond
Designing and implementing data extraction, cleansing, transformation, loading, and replication/distribution
Developing custom solutions/code to ingest and exploit new and existing data sources
Developing data profiling, deduplication, and matching logic for analysis
Organizing and maintaining data layer documentation so that others are able to understand and use it
Collaborating with teammates, other service providers, vendors, and users to develop new and more efficient methods
Effectively articulating the risks and constraints associated with software solutions, based on the environment
Required Skills
Strong software development experience, including significant Java development, data analysis/parsing, and SQL/database experience
Strong experience with the full data lifecycle, from ingest through display, in a Big Data environment
Strong experience with Java-related technologies, such as JDK, J2EE, EJB, JDBC, and/or Spring, and experience with RESTful APIs
Experience developing and performing ETL tasks in Linux and/or Cloud environments
TS/SCI clearance with a polygraph
Bachelor's degree in Computer Science
Desired Skills
Experience with Hadoop, HBase, and MapReduce
Experience with Elasticsearch
Experience working in a mission environment and/or with many different types of data
Full-time