The role requires strong English language skills, and candidates must be based in Europe for timezone alignment.
Key Responsibilities:
- Design and implement secure, scalable, and high-performance data pipelines using GCP tools such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage
- Collaborate with data analysts, data scientists, and business stakeholders to deliver data solutions that meet banking-specific needs
- Develop and manage batch and streaming ETL/ELT processes to ingest data from internal systems, APIs, and third-party providers
- Ensure compliance with banking regulations and internal data governance policies (e.g., GDPR)
- Implement robust monitoring, logging, and data quality frameworks
- Support cloud data warehouse solutions and participate in cloud migration projects
Desirable Requirements:
- Strong background in data engineering
- Hands-on GCP experience
- Proficiency in Python and SQL
- Experience with Terraform (highly beneficial)
- Stakeholder management experience (highly beneficial)
The role offers the flexibility of a remote work arrangement and is a 6-month contract with the possibility of extension.