- Initial 6 Month Contract - Likely to be extended
- 100% Remote - Candidates must be based in UK or Europe
- 8 hours per day, 5 days per week
- €350 - €450 per day
- Rate is paid per day, in Euros
- For UK candidates - this position falls outside IR35
- Start date is ASAP
I'm currently working with a global leading e-commerce brand to find a Data Engineer Contractor with experience working with cloud technology (AWS) to join a team working on high-performance platforms and pipelines used by millions worldwide.
Ideally, we are looking for someone who can start this project as soon as possible; however, we can accommodate a four-week notice period.
- Design and implement data pipelines that provide access to large datasets and transformation
capabilities for data across the org
- Write complex yet efficient code to transform curated data into business-question-oriented
datasets and data visualizations.
- Work with big data and distributed systems using technologies such as Spark, AWS EMR, and Python.
- Actively contribute to the adoption of strong software architecture, development best practices,
and new technologies. We are always improving the process of building software; we need you to contribute to that.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using open-source and GCP big data technologies
- Explore and learn the latest GCP technologies to provide new capabilities and increase efficiency
- Collaborate with Business Users, Infrastructure Engineers, and Data Scientists to recognize and help adopt best practices in gathering and transforming big data
- Identify, design, and develop new tools and processes to improve data storage and compute for the Data Engineering and Data Consumption teams and their users
- Interface directly with stakeholders, gathering requirements and owning automated end-to-end data engineering solutions.
- Provide technical guidance and mentoring to other engineers on data engineering best practices
- Work with the team to discuss technical design and development needs.
- Bachelor's degree in computer science, mathematics, or a related technical field
- 5+ years of relevant experience in data engineering or a related field
- At least 3 years of Spark development experience
- At least 1 year of experience with Airflow, NiFi, or Azkaban
- Clear understanding of testing methodologies and AWS/GCP cloud best practices
- Mastery of big data technologies (e.g. Hadoop, Hive, Spark, EMR)
- Excellence in technical communication and experience working directly with stakeholders
- Experience maintaining data pipelines built on big data technologies such as Hadoop, Hive, Spark, and EMR
- Demonstrated ability to coordinate projects across functional teams, including engineering and product management
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
If this looks like you, please get back to me with your updated CV and availability for a quick chat! 😊