- Duration: Initial 6 months
- Location: Remote
- Rate: £500 - £600 (Outside IR35)
This role focuses on accelerating the development of the Data Platform by contributing to the cross-functional engineering teams within the Wealth business unit. The primary responsibilities involve technical development, ensuring the quality of data solutions, and collaborating with team members.
As part of the engineering team, you will work with engineering team leads, solution architects, software engineers, quality engineers, and product members to contribute to the design, development, delivery, and operation of UK Wealth data products.
Strong knowledge and recent experience in many of the following data technologies and methodologies:
- Data Architecture Principles: Data lakes, data warehouses, ETL/ELT processes (knowledge of Debezium advantageous), data modelling, and data governance.
- Programming Languages: Python (for data manipulation, analysis, and scripting), SQL (for database interaction and querying), PySpark, Kafka, and potentially Java or Scala for big data processing.
- Database Technologies: Experience with relational databases like MS SQL Server and PostgreSQL, as well as NoSQL databases (e.g. DynamoDB).
- Data Intelligence Platforms: Working knowledge of enterprise DI platforms, specifically Databricks, including data ingestion, streaming, transformation, and data integration patterns and tools.
- Data Integration and Pipelines: Building, maintaining, and optimising data pipelines using tools and frameworks designed for efficient data movement and transformation.
- Cloud Data Platforms: Familiarity with cloud services for data engineering, such as AWS (e.g., S3, Redshift, Glue).
- Data Testing and Quality: Implementing data quality checks, data validation, and unit testing for data pipelines to ensure data accuracy and reliability.
- Scripting and Automation: Using Bash, PowerShell, or Python scripts to automate data-related tasks and workflows.
- Version Control: Proficient with Git and GitHub for managing code and data-related configurations.
- Build and CI/CD for Data: Understanding and applying CI/CD principles for data pipelines and deployments, including familiarity with tools like Jenkins, GitLab CI, or similar platforms.
- Data Security: Awareness of data security practices, including access control, data encryption and compliance with data privacy regulations.
- Data for AI products/solutions: Knowledge of the use of AI in the context of data. Practical experience of AI and AI tooling is beneficial but not required.
- Strong interpersonal skills with the ability to communicate effectively at all levels
- Analytical thinker with a logical approach to problem-solving
- Flexible in approach and mindset to adapt to changing priorities and requirements
- Ability to thrive under pressure in a fast-paced environment, able to prioritise tasks and manage your own time appropriately
- Able to work both independently and collaboratively within a team to achieve shared goals and objectives