Principal GCP Data Engineer – Remote (Must be UK based)

Job Overview

Excellent remuneration + bonus
I'm currently working with a SaaS product organisation that is expanding its data engineering team with a key senior hire, working directly with the CDO.
The role is remote, with one day a month in their London office, although you can go in more often if you wish!

Responsibilities
Data Ownership: Drive the design, development, and evolution of data models across multiple domains and clients. Serve as an expert in the SaaS platform, ensuring consistency and best practice in all data-related aspects.
Technical Leadership: Lead and mentor other data engineers, fostering technical excellence and effective use of data platforms. Establish and promote best practices for data engineering within the organisation.
Client Engagement: Collaborate with client delivery teams to optimise platform usage and drive successful outcomes. Support critical phases such as discovery, scoping, and problem resolution.
Training and Documentation: Develop comprehensive training programmes and documentation to scale knowledge sharing within the company. Ensure training materials and guides are up to date and accessible to stakeholders.
Strategic Collaboration: Partner with the Chief Data Officer (CDO), Chief Technology Officer (CTO), Product Director, Head of Product Architecture, and other teams to align on strategy and execution. Contribute to the evolution of the product to support business objectives.
Required Skills
7+ years' experience in Data Engineering/Architecture (or similar).
Extensive experience in data engineering, including designing and managing complex data models and solutions.
Proven leadership and mentoring abilities within technical teams.
Strong communication skills, with the ability to engage effectively with clients and stakeholders at all levels.
Hands-on expertise in modern data platforms, tools, and technologies, such as:
– Advanced data modelling – operational and analytical
– Python, SQL
– Databricks, Spark
– Orchestration frameworks such as Dataform, Airflow, and GCP Workflows
– Modern architecture and cloud platforms (GCP, AWS, Azure)
– DevOps practices
– Data warehouse and data lake design and implementation
– Familiarity with containerisation and IaC tools (e.g. Docker, Kubernetes, Terraform)
– Experience working within an agile environment (Scrum, Kanban, etc.)
Previous experience in the retail domain is a significant advantage.
Experience in developing training materials and fostering knowledge sharing.
A strategic mindset focused on scalability, performance, and innovation.
