Tech Lead - Data Engineering
At Plentific, we're redefining property management in real time. Our mission is to lead real estate through its transformation into "The World of Now," empowering property professionals through our innovative, cloud-based platform.
We harness cutting-edge technology and data-driven insights to streamline operations for landlords, letting agents, and property managers, enabling them to optimize maintenance, manage repairs, and make informed decisions instantly. Our platform is designed to create seamless, real-time workflows that transform traditional property management into a dynamic, digital experience.
Backed by a world-class group of investors, including Noa, Highland Europe, Brookfields, Mubadala, RXR Digital Ventures, and Target Global, Plentific is at the forefront of the proptech revolution. Headquartered in London with a global outlook, we're continually expanding our reach and impact.
We're looking for forward-thinking, passionate professionals who are ready to contribute to our mission and drive industry innovation. If you're excited about making an immediate impact and shaping the future of property management, explore career opportunities with us at Plentific.
The Role
We are looking for a Tech Lead - Data Engineering to serve as the primary architect and owner of our data platform. Reporting to the Head of Engineering, you will own the end-to-end technical direction of our data ecosystem and act as the most senior individual contributor in this domain.
This role sits at the intersection of data engineering and system design. You will define how data is ingested, modelled, stored, transformed, and exposed across the company, with an emphasis on robust pipelines, clear data contracts, and reliable operation at scale.
We are looking for someone who goes beyond building pipelines and focuses on designing durable, well-architected data systems. The large volumes of transactional data we generate form the foundation for machine learning and other AI-driven solutions that we are actively building and evolving. Your focus will be on designing and evolving data systems that are reliable, maintainable, and fit for long-term use, applying strong software engineering principles to how data is structured, integrated, and operated at scale.
Responsibilities
- Own the Data Platform: Take end-to-end ownership of the data platform, including ingestion, storage, transformation, and exposure layers. This includes setting technical direction and being accountable for system reliability, performance, and cost.
- System Architecture: Lead the design of distributed data systems, ensuring clean integration between backend services, external APIs, event streams, and data storage layers.
- ML-powered Product Enablement: Work closely with product and engineering teams to design and lead data foundations for machine-learning-powered product features, ensuring data quality, traceability, and production readiness.
- Data Modelling & Strategy: Act as the lead architect for data models and contracts. Design schemas for both structured and unstructured data, balancing flexibility, performance, and long-term maintainability.
- Engineering Standards & Artefacts: Set and uphold engineering standards across the data domain. Produce and maintain architecture diagrams, design documents, and Architecture Decision Records (ADRs). Champion best practices including version control, CI/CD, modular design, backwards compatibility, and automated testing.
- Pipeline & ETL/ELT Design: Architect and implement high-scale, fault-tolerant data pipelines. Make deliberate trade-offs around latency, freshness, cost, and complexity, selecting fit-for-purpose tools rather than defaulting to trends (see the pipeline sketch after this list).
- Hands-on Delivery: Spend a significant portion of your time building and maintaining core pipelines, schemas, and services in production. This is a hands-on role with direct responsibility for critical systems.
- Technical Leadership: Define the technical roadmap for data, perform deep code reviews, and mentor engineers on system design, SQL, and Python.
- Workflow Automation: Design and implement automated workflows (using tools such as n8n or custom Python services) to bridge operational gaps and reduce manual processes.
- Governance & Security: Design enterprise-grade governance frameworks covering access control, data lineage, observability, and data integrity.
- Production Ownership: Be accountable for production incidents, data quality issues, and cost regressions within the data platform.
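
To give a concrete flavour of the pipeline work described above, here is a minimal sketch of an hourly ingestion flow using Apache Airflow's TaskFlow API. The DAG name, task bodies, and schedule are illustrative assumptions only, not a description of Plentific's actual pipelines; an equivalent flow could be built with dbt or another orchestrator.

```python
# Minimal sketch of an extract/transform/load DAG (Airflow 2.x TaskFlow API).
# All names and task bodies below are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def repairs_ingestion():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull new work-order records from an upstream service or API.
        return [{"order_id": 1, "status": "open", "cost": 120.0}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder: enforce the data contract (required fields present) before loading.
        return [r for r in records if r.get("order_id") is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the validated batch to the warehouse (e.g. Snowflake).
        print(f"loading {len(records)} records")

    load(transform(extract()))


repairs_ingestion()
```

The point of the sketch is the explicit extract/transform/load boundaries and the contract check before anything reaches the warehouse, which is where the latency, freshness, and cost trade-offs mentioned above are made.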
Requirements
- Architectural Mindset: Proven experience as a Tech Lead, Principal Engineer, or System Architect designing and owning complex, distributed systems.
- Strong Software Engineering Foundations: A software-engineer-first mindset with deep experience in Python and production-grade engineering practices. Experience with libraries such as Pandas or Polars is expected, but architectural thinking matters more than specific tools.
- Machine Learning Exposure: Hands-on experience working with machine learning systems and tooling (e.g. Hugging Face, feature stores, model inference pipelines, or similar), with an emphasis on enabling ML in production rather than research experimentation.
- Database & Storage Expertise: Advanced SQL skills and hands-on experience with modern cloud data warehouses (e.g. Snowflake or equivalent), alongside solutions for unstructured or semi-structured data.
- ETL/ELT & Orchestration: Experience designing and operating modern data pipelines using tools such as dbt, Airflow, or equivalent orchestration and transformation frameworks.
- Engineering Rigor: Deep experience with Git-based workflows, CI/CD pipelines, automated testing, and maintaining long-lived systems in production.
- Engineering Judgement: Demonstrated ability to make and defend trade-offs: when to model data, when not to ingest it, and how to balance correctness, performance, and cost.
- Analytical Depth: Ability to interrogate and analyse data directly to validate system behaviour and ensure high levels of data quality (see the data-quality sketch after this list).
- Experience with Analytics-as-Code platforms such as Looker/LookML.
- Experience building internal platforms that enable, rather than directly deliver, BI and reporting.
- Experience with automation platforms such as n8n for connecting operational systems.
- Experience designing systems for multimodal data (text, images, video, documents).
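
As an illustration of the analytical, data-quality work mentioned in the requirements, here is a minimal sketch using pandas. The dataset and column names (order_id, cost) are hypothetical stand-ins for whatever contract a real table would carry.

```python
# Minimal data-quality check over a batch of records using pandas.
# Column names and rules are hypothetical examples.
import pandas as pd


def validate_work_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality issues found in a batch of work orders."""
    issues = []
    if df["order_id"].isna().any():
        issues.append("missing order_id values")
    if df["order_id"].duplicated().any():
        issues.append("duplicate order_id values")
    if (df["cost"] < 0).any():
        issues.append("negative cost values")
    return issues


batch = pd.DataFrame({"order_id": [1, 2, 2, None], "cost": [120.0, -5.0, 30.0, 10.0]})
print(validate_work_orders(batch))  # flags missing ids, duplicates and negative costs
```

In practice, checks like these would typically run inside the pipeline itself (for example as a pipeline task or a dbt test) so that bad batches are caught before they reach downstream consumers or ML features.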
As you can see, we are quickly progressing with our ambitious plans and are eager to grow our team of doers to achieve our vision of managing over 2 million properties through our platform across various countries. You can help us shape the future of property management across the globe. Here's what we offer:
- A competitive compensation package
- 25 days annual holiday, plus 1 additional day for every year served (up to 5 years)
- Flexible working environment including the option to work abroad
- Private health care for you and immediate family members with discounted gym membership, optical, dental and private GP
- Enhanced parental leave
- Life insurance (4x salary)
- Employee assistance program
- Company volunteering day and charity salary sacrifice scheme
- Learning management system powered by Udemy
- Referral bonus and charity donation if someone you introduce joins the company
- Season ticket loan, Cycle to work, Electric vehicle and Techscheme programs
- Pension scheme
- Work abroad scheme
- Company-sponsored lunches, dinners and social gatherings
- Fully stocked kitchen with drinks, snacks, fruit, breakfast cereal etc.