Data Engineer - 39 hours

Ref: 27551

Hours: Full-time

Contracted Hours: 39

Contract Type: Permanent

Location: Chester House, Epsom Ave, Handforth, Cheadle, Greater Manchester, SK9 3RN

Description

We have recently launched an innovative Petcare Platform and are excited to expand our capabilities by integrating real-time event data with our existing operational and analytical platforms (Microsoft Azure and Google Cloud Platform). We're seeking talented Data Engineers to join our Data Platforms team within the Engineering function. In this role, you’ll collaborate with colleagues across Engineering, Data Science, and Analytics functions to enhance our platform’s capabilities and deliver impactful improvements to our business.

The roles sit within a broad team of Engineers who bring data together from across the business. We are currently transitioning from traditional batch orchestration (Airflow/Composer, BigQuery) to event streaming; this will involve developing and deploying new tools and processes on our analytics platform, with support and guidance from experienced Engineers across the Platform, Data, and Analytics Engineering disciplines. In this role, you will directly shape the delivery of this exciting new development in our analytics capabilities; the pipelines and data modelling processes you build will change the way the business sees its data, enabling decisions that directly impact our customers faster than ever before.
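
To give a flavour of the streaming side of this transition, the sketch below shows a minimal GCP Pub/Sub consumer in Python. It is illustrative only: the project and subscription names are hypothetical placeholders, and a real pipeline would add schema validation, error handling, and monitoring.

    # Minimal sketch of consuming events from a GCP Pub/Sub subscription.
    # Project and subscription IDs are hypothetical placeholders.
    import json
    from concurrent.futures import TimeoutError

    from google.cloud import pubsub_v1

    PROJECT_ID = "petcare-analytics"       # hypothetical
    SUBSCRIPTION_ID = "order-events-sub"   # hypothetical

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        # Decode the event payload and hand it to downstream processing.
        event = json.loads(message.data.decode("utf-8"))
        print(f"received event: {event.get('event_type')}")
        message.ack()  # acknowledge so Pub/Sub does not redeliver

    streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
    try:
        streaming_pull.result(timeout=30)  # run for a short demo window
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()  # wait for shutdown to complete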

As a Data Engineer, you will be responsible for:

  • Implementing and maintaining data flows and pipelines connecting operational systems, analytical and reporting data layers
  • Handling both batch and streamed data
  • Optimising cloud platform implementation for scalability
  • Producing accurate and concise documentation of processes and code
  • Developing across all stages of the ETL/ELT (extract-transform-load / extract-load-transform) process; see the sketch after this list
  • Contributing to design decisions around pipelines and platform architecture
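
As an illustration of the ETL/ELT point above, a simple ELT step on BigQuery might look like the following Python sketch. It is not our production code: the bucket, dataset, and table names are hypothetical placeholders.

    # Illustrative ELT step using the BigQuery client library.
    # Bucket, dataset, and table names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()

    # Extract + Load: ingest a raw CSV export from Cloud Storage as-is.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/orders.csv",
        "petcare-analytics.raw.orders",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
        ),
    )
    load_job.result()  # block until the load completes

    # Transform: reshape the raw layer into an analytics-ready table in SQL.
    client.query(
        """
        CREATE OR REPLACE TABLE `petcare-analytics.analytics.daily_orders` AS
        SELECT DATE(order_ts) AS order_date, COUNT(*) AS orders
        FROM `petcare-analytics.raw.orders`
        GROUP BY order_date
        """
    ).result()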

Key Responsibilities:

  • Develop, maintain, and improve the cloud data platform
  • Maintain relevant technical documentation
  • Deliver software that is scalable, fault tolerant, highly available, and well tested
  • Conduct code reviews, pair programming, and knowledge sharing sessions
  • Drive automation across CI/CD, infrastructure management, and configuration
  • Ensure that data entering and leaving the cloud data platform meets the highest standards of quality and legal compliance

Essential Experience, Knowledge & Expertise

  • Experience working with cloud-based data platforms (hands-on experience with a major cloud provider such as GCP, AWS, or Azure)
  • Working knowledge of a programming language (Python/Go preferable)
  • Strong development skills (ELT/ETL, batch, streaming)
  • Experience of contributing to shared codebase under version control
  • Familiarity with batch orchestration tools (e.g. Airflow)
  • Familiarity with event-based real-time streaming (e.g. GCP Pub/Sub, Azure Service Bus, Kafka)
  • Data modelling and data structures; experience with data lake and warehousing tools and techniques (e.g. SQL, dbt, BigQuery, Spanner, Snowflake, Redshift)
  • Comfortable with troubleshooting via logs, as well as building informative logging into pipelines (a brief sketch follows this list)
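
On the final point, "informative logging" might mean wrapping each pipeline step with structured, searchable log lines, as in this hedged sketch (step and source names are placeholders):

    # Illustrative only: structured logging around a pipeline step so that
    # failures and timings are traceable from logs. Names are placeholders.
    import logging
    import time

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    log = logging.getLogger("pipeline")

    def run_step(step_name, fn, **context):
        """Run one pipeline step, logging timing, context, and failures."""
        start = time.monotonic()
        log.info("step=%s status=started context=%s", step_name, context)
        try:
            result = fn()
        except Exception:
            log.exception("step=%s status=failed context=%s", step_name, context)
            raise
        log.info("step=%s status=ok duration_s=%.2f",
                 step_name, time.monotonic() - start)
        return result

    # Example usage with a trivial stand-in for a real transform:
    run_step("load_orders", lambda: 42, source="gs://example-bucket/raw/orders.csv")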


Desirable Experience, Knowledge & Expertise

  • Experience working with Agile methodologies such as Scrum and Kanban
  • Proven experience of architecture design and data-product development using GCP’s capabilities (GKE, Cloud Run, Dataflow, BigQuery, Pub/Sub); even better if you’re Google Cloud Certified (Associate or Professional level)
  • Experience with data visualisation tools such as Tableau, Qlik, Looker, or Power BI
  • Proficiency in multiple programming languages
  • Background working alongside Data Science professionals in an MLOps capacity
  • Use of test-driven development and automated testing

Role Specific Competencies

  • Product Mindset – View the platform and the datasets we make available as products, and colleagues as customers, with a focus on leveraging the most value for the business
  • Operational Excellence – Build data pipelines that follow best practices, with embedded testing, monitoring, and fault tolerance that allow us to adhere to SLAs
  • Agile Approach – Embrace iterative development and delivery, with continuous feedback to refine and enhance products
  • Technical Excellence – Write optimised and efficient code, deliver robust data pipelines, and follow best practices

Why Join Us:

This is a unique opportunity to shape the future of our Petcare Platform and make a tangible impact on our business and customers. You will be working at the forefront of data engineering, leveraging cutting-edge technology to build a scalable and efficient platform. Join us and be part of a team that values innovation, collaboration, and excellence.


A few things to note 

  • We do not need agency support; we do all our recruitment in-house.
  • We can't offer visa sponsorship or relocation support for this role.

Pets just see people. They aren’t biased and they don’t discriminate. We take our inspiration from pets and we value and respect difference in all its forms. Our aim is to reflect the diversity of the communities we operate in and every colleague can help us achieve this. We encourage our people to be themselves so even if your skills and experience don’t perfectly align, if you think you can make a unique contribution through your values and behaviours, we want to hear from you!

Please note that we may close applications for this role early.

Organisation: Pets at Home

Date Posted: 03-10-2024

Expiry Date: 31-10-2024