We are currently seeking an experienced Data Engineer to join a leading bank in Switzerland. This exciting opportunity involves helping the bank build, secure, and optimise data pipelines to ensure reliable data flow across the business.

The role requires English language skills, and candidates must be based in Zurich or within commuting distance, as the role is hybrid.

Key Responsibilities:
  • Design, develop, and maintain ETL/ELT pipelines using Informatica PowerCenter/IDQ and Apache Spark.
  • Collaborate with business stakeholders, data analysts, and data scientists to gather requirements and translate them into scalable data solutions.
  • Work with structured and unstructured data from multiple banking domains (e.g., retail, risk, compliance, payments).
  • Implement best practices for data integration, quality, lineage, and governance to meet banking regulatory and compliance standards.
  • Optimise Spark jobs and workflows for performance, scalability, and cost efficiency.
  • Partner with infrastructure and cloud teams to support deployment and monitoring of data pipelines on-premises and/or in cloud environments (AWS/Azure/GCP).
  • Develop and maintain technical documentation, data dictionaries, and process workflows.
  • Support production operations by troubleshooting, resolving issues, and implementing preventive measures.
 
Desirable Requirements:
 
  • Strong background in Data Engineering
  • Experience with Apache Spark
  • Experience with Informatica (PowerCenter/IDQ)
  • Proficiency in Python and SQL
 
The role offers the flexibility of a hybrid work arrangement and is a 6–12 month contract with the possibility of extension.