
Member of Technical Staff - Data Platform

Company:
Nile Global Inc
Location:
San Jose, CA
Posted:
May 25, 2025

Description:

Company Overview:

At Nile, we envision an enterprise network that inherently defends against cyber threats, eliminates lateral attack vectors like ransomware, and operates free of complexity. Our goal is to deliver Campus Network-as-a-Service (NaaS) that makes network operations virtually invisible to our customers by pushing the boundaries of autonomy. Imagine a network that continuously monitors, optimizes, and upgrades itself—all without the need for human intervention. Our audacious journey began in 2018 when we brought together a team of industry veterans and visionaries in networking, cybersecurity, cloud software, and AI to disrupt a $100 billion enterprise networking market, starting with the wired and wireless LAN. Today, our Nile Access Service is redefining connectivity as a service for organizations worldwide, from cutting-edge technology companies to leading healthcare and financial institutions, and beyond.

Where do we go from here? Well, that’s where you come in. We are expanding in all areas, bringing in some of the brightest talent to further shape Nile’s future, prepare for growth, and tackle tough tasks to ensure our momentum never slows.

Job Description:

We are looking for a strong software engineer to develop a software solution from the ground up on the AWS/GCP cloud, leveraging cloud-native and open-source services.

You have experience writing software from the ground up with minimal guidance, strong design skills, and a passion for writing high-quality code and open-source software.

You are a strong individual contributor who can hold their ground on technical depth while collaborating with peers and earning their respect through technical prowess. This is not a people-management role.

Relevant Experience:

Prior experience developing software or SaaS on the AWS/GCP platform from the ground up and at scale (cradle to grave), with hands-on experience using AWS/GCP services.

Experienced in stream data processing, encompassing data engineering and big data technologies such as Spark, Flink, Kafka, Druid, Iceberg, Delta Lake, and Hudi.

Technologies:

Languages: Java, Go; deep familiarity with AWS/GCP cloud software tools and services.

Expertise working with high-volume, real-time streaming (one of Flink, Beam, Spark).

Data lake/lakehouse expertise, with experience in Hudi, Iceberg, or Delta Lake.

Expertise in Java development and Spring Boot.

Expertise working with relational databases (MySQL), time-series stores (Druid), Elasticsearch, and DynamoDB.

Microservices-based development; Kubernetes, AWS, and GCP knowledge preferred.

Quick learner who can independently deliver designs and solutions to problems. Networking knowledge is a plus.

Key Responsibilities:

Design and develop scalable streaming platform solutions.

Implement and maintain gRPC-based collectors for events/metrics/logs.

Work with protocol buffers and network telemetry data.

Develop and maintain containerized applications using Docker and Kubernetes.

Maintain and optimize existing ETL processes for performance and reliability.

Work with distributed systems and handle large-scale data processing.

Ensure data quality and consistency across different systems.

Education & Years of Experience:

BS/MS in Engineering or equivalent experience in Cloud software development.

5–10 years of experience, with a focus on data-intensive applications.

Full-time
