One of our long-standing consulting partners is seeking an experienced Data Engineer to help build and operate reliable data platforms for organisations going through digital transformation. They value practical engineering, straightforward architecture, and solutions that hold up in real-world use.

Key Responsibilities:
  • Build and maintain scalable data pipelines across cloud and on-prem systems.
  • Integrate data from diverse sources in a repeatable, stable way.
  • Design and test distributed processing workflows for large datasets.
  • Collaborate with data scientists to support model development and deployment.
  • Evaluate performance and cost implications of different solution designs.
  • Produce clear dashboards and visualisations for end users.
Requirements:
  • Strong background in large-scale data engineering and distributed systems.
  • Several years' experience in data or software engineering with a focus on sustainable, high-quality code.
  • Proficiency in Python and SQL, plus experience with or interest in technologies such as Spark, Hadoop, Hive, Airflow, RDBMS, NoSQL, Kubernetes, DevOps tooling, Java, or .NET.
  • Good understanding of AWS, Azure, or GCP.
  • Degree in Computer Science, Mathematics, or a related field.
  • Fluent in German and English.
Highlights:
  • Based in Zurich, with 2-3 days of home office per week.
  • An additional 2 weeks of fully remote work per year.
  • Compensation: CHF 130,000-140,000.