(multiple positions available)
Build Extract, Transform, and Load (ETL) processes that allow data to flow seamlessly from source to target. Load and enhance dimensional data models, perform code reviews, and participate in sprint ceremonies. Apply business rules in code to ensure data is clean and interpreted correctly by all business stakeholders. Analyze reports using various tools. Support offshore operations teams, train operations teams on ETL processes, and answer user questions on data management processes and results. Troubleshoot and resolve issues with data or ETL processes. Fine-tune existing code to make processes more efficient. Create and maintain documentation describing data management processes. Hybrid work schedule.
Five years of experience as a solution architect or in a closely related position. Must have experience with: Java/J2EE, Scala, Python, JavaScript; Spring 3.x, Hibernate 3.x, JSP, Servlets, EJB, JPA, web services; Hadoop, HDFS, MapReduce, HBase, Hive, Oozie, Sqoop, Spark, Kafka, NiFi; Hortonworks Data Platform, Azure HDInsight, Azure cloud, Amazon EMR, Databricks; Azure Data Factory, ADLS Gen2, Azure Databricks, Key Vault, Azure Table Storage, AzCopy, SQL Warehouse, Synapse, Azure Search; Oracle, MySQL, SQL Server, PostgreSQL; Spring XD, HAWQ, Redshift; JIRA, Git, Mercurial, SVN, Maven, Azure DevOps, Terraform.
Bachelor’s degree or foreign equivalent in Computer Science or Engineering.
Please copy and paste your resume into the body of an email (do not send attachments; we cannot open them) and send it to candidates at placementservicesusa.com with reference #0251345 in the subject line.