Hadoop jobs in Hartford, CT

Jobs 1 - 10 of 11

Data Engineer with PySpark

Infowave Systems, Inc  –  Rocky Hill, CT, 06067
... Knowledge of big data technologies (e.g., Hadoop, Spark). Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift). Understanding of machine learning concepts and algorithms. Why Join Us: Opportunity to work on cutting-edge ... - Apr 30

Data Engineer (Hartford, CT)

The Hartford  –  Hartford, CT
... In addition, position requires 1 year of experience in the following skills: Hadoop (Sqoop, Hive, NoSQL); ETL Processes; SQL. Telecommuting available. Salary Range: $129,500.00 - $146,400.00 per YEAR Contact: Apply online at Compensation The listed ... - Apr 29

Staff Data Engineer (Hartford, CT)

The Hartford  –  Hartford, CT
... Research, experiment, and utilize leading big data methodologies (AWS, Python-PySpark, Hadoop/EMR, Snowflake, Oracle DB, Talend, Informatica ETL, Kafka) with cloud/on premise hybrid hosting solutions, on a project level. Implement, and test data ... - Apr 29

Director - EDS Platform Solution Architecture

The Hartford  –  Hartford, CT
... Hands on experience with data platforms, ETL, BI/Analytics tools including Informatica, Snowflake, AWS EMR or open-source tools like Spark, PySpark, Hadoop etc. Knowledge of architecture frameworks (TOGAF etc.) or AWS Well-architected framework, ... - Apr 09

Senior Data Scientist

CVS Health  –  Hartford, CT
... Demonstrated expertise in working with distributed computing frameworks such as Hadoop and Spark. Strong analytical and problem-solving skills, with a passion for leveraging data to drive actionable insights and inform strategic decisions. Ability ... - Apr 30

Cloud Data Engineer (SQL, Python, ETL, Cloud)

CVS Health  –  Hartford, CT
... and deploying data pipelines 2+ years' experience using Python and any ETL tool Working experience in Relational databases or Hadoop with strong understanding of SQL or HQL Hands on experience in building solutions in public cloud environments (GCP, ... - Apr 30

Associate Cloud Data Engineer (SQL, Python, ETL, Cloud)

CVS Health  –  Hartford, CT
... and deploying data pipelines 1+ years' experience using Python and any ETL tool Working experience in Relational databases or Hadoop with strong understanding of SQL or HQL Hands on experience in building solutions in public cloud environments (GCP, ... - Apr 30

Principal Data Architect

CVS Health  –  Hartford, CT
... Experience in migration from On-Prem Data Warehouse (like DB2, Hadoop) to Cloud Data Warehouse (like GCP, Snowflake) is preferred. Education Bachelor's Degree in Computer Science, information systems or related field or equivalent experience required ... - Apr 16

Sales Director

Super Micro Computer  –  Hartford, CT
Job Req ID: 23773 About Supermicro: Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/ Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We ... - Apr 30