Start date: ASAP
Duration: 6 months initially with a view to extend
Location: Remote (must be UK-based)
Rate: £400 - £530 per day, Outside IR35
Responsibilities
- Design, build, and maintain ETL/ELT pipelines and batch/streaming workflows.
- Integrate data from external APIs and internal systems into Snowflake and downstream tools.
- Use web scraping / browser automation to pull data from platforms that only offer UI-based data extraction (no APIs).
- Own critical parts of our Airflow-based orchestration layer and Kafka-based event streams.
- Ensure data quality, reliability, and observability across our pipelines and platforms.
- Build shared data tools and frameworks to support analytics and reporting use cases.
- Partner closely with analysts, product managers, and other engineers to support data-driven decisions.
Requirements
- 3 years of experience as a Data Engineer working on data infrastructure.
- Strong Python skills and hands-on experience with SQL.
- Experience with modern orchestration tools like Airflow.
- Experience with APIs and extracting data from APIs.
- Understanding of data modelling, governance, and performance tuning in warehouse environments.
- Comfort operating in a cloud-native environment like AWS.
- Terraform experience.
- Experience with Snowflake.
- Web scraping via browser automation (e.g. Playwright, Selenium, Puppeteer).