Lead SRE - Data Engineering

Project description:

  • As part of the customer's data organisation, the data engineering team's mission is to build the technical foundation that enables the whole organisation to produce and consume data from a new, central data lake, to supply data to external parties, to connect data to the BI and analytics tools used by business teams, and to bring complex, state-of-the-art machine learning models into production.
  • The team is fully cloud-native, working with Google Cloud Platform products such as BigQuery, Cloud Dataflow, Cloud Dataproc, Kubernetes Engine, Cloud Bigtable, Compute Engine,...
  • Because the team only came into existence recently, there is currently no proper SRE support. As lead SRE, you would be responsible for setting up an SRE team within data engineering, drafting and defining the boundaries and synergies between SRE and the data engineers, and thereby helping the team deploy more reliable and performant services.


Responsibilities:

  • Set up a new SRE team for data engineering
  • Define the scope of SRE within the technical landscape of the data engineering team and put in place the right collaboration between SRE and engineers
  • Implement tools and processes for deploying data microservices (on Kubernetes) and pipelines (CI/CD, canary releases, rollbacks,...)
  • Automate provisioning of a resilient data infrastructure (Pub/Sub topics and subscriptions, BigQuery datasets and tables, Dataflow pipelines,...)
  • Work with data engineers to facilitate regular releases
  • Keep services in operational condition; analyse and resolve performance and scalability anomalies (via load tests) in current and historical deployments
  • Supervise and monitor the data infrastructure in collaboration with the Monitoring Operations Center (MOC), and manage access policies and security
  • Evangelise SRE good practices and participate in building a true transversal SRE community

Technical skills:

  • Expert in cloud environments, preferably Google Cloud Platform (GCP); with AWS or Azure experience, a strong willingness to learn GCP is a prerequisite
  • Good knowledge of UNIX command line
  • IaC: Packer, Terraform,...
  • Containers: Docker, Kubernetes
  • Experience with streaming technologies such as Apache Beam (on Google Dataflow), Kafka Streams, Spark Streaming,... is a big plus
  • Languages: Java, Python, SQL, Bash. Knowledge of Scala and/or Go is a plus
  • Familiar with different database technologies/concepts: relational (MS-SQL, PostgreSQL,...) and non-relational (Bigtable/HBase, Elasticsearch,...)
  • Experience with message buses like Kafka, RabbitMQ or ActiveMQ is a plus

Contact person:

Contact name: Recruitment IT Staffing