... Must have at least 3 years of ETL Design and Development experience using Ab Initio. Data Integration project experience on Hadoop Platform, preferably Cloudera. At least one project implementation of Ab Initio CDC (Change Data Capture) in a Data ... - Apr 26
... Experience with big data technologies like Hadoop, Spark and Hive. AWS experience (S3 and EMR). Proficiency in Hive / Spark SQL / SQL. Experience with Spark. Experience with one or more programming languages like Java & Scala. Ability to push the ... - May 01
... Familiarity utilizing virtualization and distributed file systems, such as Hadoop (or similar), in development and deployment environments; Familiarity using git, svn, JIRA, or other version control and issue-tracking tools; Experience ... - Apr 11
... Willingness to be a committer/contributor to open source applications; Java programming for distributed systems, with experience in networking and multi-threading; Apache Hadoop; Apache Accumulo; Apache NiFi; Agile development experience; Well-grounded in ... - Apr 20
... Experience with: Frameworks such as Hadoop, Spark, Kafka, etc. AWS Technologies: RDS, Lambda, SNS, SQS, AppSync, EventBridge. Containerization and orchestration tools such as Docker and Kubernetes. - Apr 27
... A successful candidate for this position has experience working with large Hadoop and Accumulo based clusters and interfacing with hardware, software, security, and network teams. Experience with Bash, Python, and configuration management tools such ... - Apr 22
... In this role, you will be working in a cloud environment platform, built with Java on free and open-source software products including Kubernetes, Hadoop and Accumulo, to enable the execution of data-intensive analytics on a managed infrastructure. ... - May 02
... Experience with large data management techniques such as partitioning, multi-threading, parquet, Spark/Hadoop, etc. Familiarity with GIS (Geographic Information Systems) software and spatial analytics tools for mapping and analyzing geospatial data ... - May 02
... Familiarity with big data frameworks like Hadoop, Spark, or Kafka for processing and analyzing large volumes of data. Additional information: Interview format: 1st – Phone Screen; 2nd – MS Teams Video Interview. Please send resume to: - Apr 20