Closing soon

We are looking for a Junior Data Lake Developer who will be responsible for implementing the changes and improvements required within our Data Lake solution.

The role

You will deliver work ranging from large-scale changes linked to Business transformation programmes to minor improvements requested by a single user. You will liaise with other departments to ensure a high quality of work, whilst always looking to improve the performance of the Data Lake and Data Warehouse to satisfy the ever-changing demands of the Business.

The day-to-day work is interesting, challenging and fast-paced amidst a hardworking and delivery-focused Company ethos. We hire people with a broad set of technical skills who are ready to tackle some of technology’s greatest challenges.

This role is eligible for inclusion in the Company’s hybrid working from home policy.

Preferred skills and experience

  • Commercial or academic exposure to the GCP product suite or other cloud providers.
  • Experience with relational, set-based processing through SQL queries.
  • Commercial or academic experience of Python scripting.
  • Ability to work as part of a team of highly technical developers.
  • Ability to work in a continually changing and reactive environment.
  • Committed and flexible with a can-do attitude towards work.
  • Keen attention to detail.
  • Ability to work to deadlines.
  • Excellent communication skills.

Main responsibilities

  • Maintaining the GCP environment, including BigQuery, Analytics Hub and BigLake.
  • Developing load processes into the Data Lake.
  • Developing SQL and Python code to process data into the Data Lake.
  • Being actively involved in the development of processes and standards of the GCP products used.
  • Being involved in the ongoing evolution of departmental standards and enforcing the adherence to the development process.
  • Creating and maintaining all relevant documentation.