The Day to Day:
- Building and maintaining large-scale batch and streaming pipelines to process millions of daily events.
- Designing data products that power analytics, reporting, and applications.
- Improving platform scalability, reliability, and developer experience.
- Working closely with analysts and business stakeholders to deliver practical, impactful solutions.

What You Bring:
- Hands-on experience in large-scale data processing (Azure Databricks or Apache Spark).
- Knowledge of SQL, Power BI, or web analytics tools (e.g. Google Analytics) is a plus.
- Fluent English, with good German or French.
- Previous experience with AWS or GCP is welcome!

What We Offer:
- Work in a dynamic, diverse, cross-regional team.
- Influence on the organisation's digital transformation journey.
- Modern cloud technologies and innovative Data & AI-based projects.
- Super-flexible working arrangements and an inclusive culture that promotes continuous learning.