Featured

Big Data Platform Admin

Posted 09 March 2023
Salary €500 - €700 per day, negotiable
Location Zurich
Job type Contract
Discipline Data
Reference BBBH80281_1678368741
Contact Name Enrique Cabanas Rodriguez

Job description

Big Data Platform Admin

Start Date: April 2023, or after a 4-week notice period.

Contract Length: 6 Months + Extension

Location: Zurich, Switzerland. 3 days on-site, 2 days remote

Pay: 500 - 700 CHF per day

I am working with a leading global consultancy to find an experienced Big Data Platform Admin to join one of the leaders in the banking and financial services industry.

Ideally, we are looking for someone who can start this project in April 2023; however, we can accommodate a four- to six-week notice period.

Core responsibilities:

  1. Infrastructure & technical product engineering, integration and management
  2. Client multi-tenancy management and technical support
  3. Lifecycle (EoL, upgrade) management and coordination

Working experience:

  • Experience in Big Data platform management or Big Data solution architecture.
  • A minimum of 1 year of experience handling CDH-to-CDP migration.
  • Expertise in big data lake design and in multi-tenancy framework setup and implementation.
  • Experience with Cloudera Hadoop components such as HDFS, HBase, Impala, Hue, Spark, Sentry, Ranger, Hive, Kafka, YARN and Zookeeper
  • Experience in multi-tenancy architecture and implementation, from both a developer and an admin perspective.
  • Expert knowledge of YARN/Impala resource management capabilities.
  • Ability to take end-to-end responsibility for the Hadoop lifecycle in the organization.
  • Detect, analyze and remediate performance problems.
  • Experience integrating data from multiple data sources and onboarding applications onto the Hadoop platform.
  • Experience in at least one of the following: Python, Java, or Unix shell scripting, and an eagerness to pick up new programming languages on the go.
  • Ability to automate mundane manual tasks through scripting and by building automation solutions.
  • Ability to function within a multidisciplinary, global team. A self-starter with a strong curiosity for extracting knowledge from data and the ability to elicit technical requirements from a non-technical audience.
  • Data Concepts (ETL, near-/real-time streaming, data structures, metadata and workflow management)
  • Deep understanding of DevOps and Agile software development methodologies.

Nice to have

  • Python, Scala, Java, Kafka, CSS, performance engineering and tuning skills (Impala, Spark, Hive), Azure Cloud