
Senior Application Developer (focus on Data Engineering)

Company:
Callista ACE AG
Location:
Wabern, Bern, 3084, Switzerland
Posted:
November 06, 2025

Description:

Are you passionate about data, modern cloud technologies, and clean code?

Then an exciting opportunity awaits you at the intersection of software development and data engineering.

Working in an agile Scrum team, you will contribute to the further development of the data platform and create scalable, high-performance backend solutions that intelligently automate and optimize data processes.

Tasks:
- Development and maintenance of scalable, efficient backend solutions using Python and PySpark in a Databricks environment (see the PySpark sketch below)
- Extension and maintenance of existing backend components (e.g., transformation and test engine)
- Implementation of unit tests with high test coverage and integration into CI/CD pipelines (GitLab, trunk-based development)
- Working in a heterogeneous environment with REST APIs, Oracle databases, file imports and Docker containers
- Automation of data preparation processes and local workflows with Dagster or comparable tools such as Airflow or Databricks Workflows (see the Dagster sketch below)
- Identifying technical gaps and creating operationally driven stories
- Translation of business requirements into technical specifications, documentation and user stories
- Active participation in the agile Scrum team, including code reviews, technical discussions and continuous product improvement (DevOps)
- Support in the operation and further development of the data platform in a cloud environment (Azure experience is an advantage)

Other activities:
- Implementation of modern data processing concepts such as ETL/ELT, Data Warehouse, Data Lake, Lakehouse and Medallion architecture, including performance optimization
- Modeling and further development of star and snowflake data models for efficient data analysis and provision

Requirements:
- Completed degree or training in computer science, data science or business informatics
- Solid experience in backend development with Python (PySpark experience is an advantage)
- Knowledge of CI/CD pipelines (GitLab), Git and DevOps practices
- Experience with SQL and Oracle databases
- Experience in cloud environments (Azure is an advantage)
- Familiarity with orchestration tools such as Dagster, Airflow or Databricks Workflows
- Good knowledge of modern data architectures (ETL/ELT, DWH, Lakehouse)
- Enjoyment of agile collaboration and technical discussions
- Very good German and English skills

General framework conditions:
- Duration: January 2026 to December 2026
- Workload: 60%
- Location: Bern, mandatory on-site attendance twice a week
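
To illustrate the kind of work described in the Tasks section, below is a minimal sketch of a unit-testable PySpark transformation of the sort that might run in a Databricks environment. The column names, sample data, and local SparkSession setup are purely illustrative assumptions, not part of the actual role or its codebase.

```python
# Minimal sketch: a pure transformation function that can be unit-tested in isolation
# and reused inside a Databricks job. All names here are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def clean_orders(raw: DataFrame) -> DataFrame:
    """Bronze-to-silver style cleanup: deduplicate, normalise types, drop invalid rows."""
    return (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .filter(F.col("amount") > 0)
    )


if __name__ == "__main__":
    # Local session so the same function can be exercised in a unit test or CI pipeline.
    spark = SparkSession.builder.master("local[*]").appName("demo").getOrCreate()
    df = spark.createDataFrame(
        [("o1", "2025-01-01 10:00:00", 12.5),
         ("o1", "2025-01-01 10:00:00", 12.5),   # duplicate row, removed by cleanup
         ("o2", "2025-01-02 09:30:00", -3.0)],  # invalid amount, filtered out
        ["order_id", "order_ts", "amount"],
    )
    clean_orders(df).show()
```

Keeping the transformation as a plain function of DataFrames is what makes the "high test coverage" requirement practical: a pytest case can build a small DataFrame, call the function, and assert on the result without any Databricks-specific setup.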
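
For the orchestration task, here is a similarly hedged sketch of a tiny Dagster asset graph wiring a raw input into a cleaned output. The asset names and the pandas payload are illustrative assumptions only; Airflow or Databricks Workflows could express the same dependency.

```python
# Minimal sketch of a Dagster asset graph; all asset names and data are hypothetical.
from dagster import Definitions, asset, materialize
import pandas as pd


@asset
def raw_orders() -> pd.DataFrame:
    # In practice this step might read a file import or call a REST API.
    return pd.DataFrame({"order_id": ["o1", "o2"], "amount": [12.5, -3.0]})


@asset
def cleaned_orders(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Dagster infers the dependency on raw_orders from the parameter name.
    return raw_orders[raw_orders["amount"] > 0].drop_duplicates("order_id")


defs = Definitions(assets=[raw_orders, cleaned_orders])

if __name__ == "__main__":
    # Materialise both assets locally, e.g. as part of a developer workflow.
    materialize([raw_orders, cleaned_orders])
```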
