DevOps Data Engineer – HIRING ASAP

Start date: ASAP
Duration: 3 months initially with a view to extend
Location: 1 day per week on site in Birmingham, 4 days per week remote working.
Rate: £375 - £393 per day, inside IR35
 
Key Skills
  • Strong DevOps engineering background
    • CI/CD pipelines
    • Git / GitHub
    • Jenkins
    • SonarQube
    • Linux, e.g. Red Hat
    • Groovy / Bash / Python scripting
  • Good understanding of Python library and application development
    • Dependency management and package management, e.g. Poetry, pip
    • pandas and NumPy
    • APIs, e.g. Flask, Dash
    • IDEs (PyCharm, VS Code) and remote development
  • Good understanding of data engineering
    • ETL / ELT pipelines
    • Spark
    • Airflow
    • SQL / NoSQL
    • Delta Lake
    • Parquet
    • Avro
    • Partitioning
    • Starburst
    • S3 bucket
    • Postgres
    • MLflow
  • Experience in cloud and containerization
    • GCP / internal cloud
    • Docker / Kubernetes
    • Argo CD
    • Monitoring tooling, e.g. ELK
    • Service Mesh, e.g. Istio
    • Experience in secrets management, e.g. Vault
  • Good understanding of networking, security, and operating systems
    • TCP/IP
    • DNS
    • SSH
    • SSL/TLS
    • Encryption and tokenization
    • CPU and memory management
  • Experience of supporting Big Data infrastructure
    • Hadoop cluster
    • Spark cluster
    • JDK/JVM
    • Cloudera
  • Strong problem solving and troubleshooting skills.
  • Able to take ownership and work independently.
  • Able to work under pressure in a fast-paced environment.
  • Good at documentation and knowledge sharing.
  • Able to communicate and work with multi-region / multi-cultural teams.