Data Architect

Posted 8 hours 48 minutes ago by WebLife Labs

Permanent
Full Time
Other
Not Specified, United Kingdom
Job Description
About the Role

As the founding Data Architect for our new venture, you will define and build our entire data ecosystem from the ground up. You will design, implement, and operate the production-grade infrastructure that becomes the backbone of our AI-native SaaS product. Your mission is to turn complex business requirements into scalable data solutions, from sophisticated multi-tenant cloud architectures and robust ELT pipelines to observable workflows that power our AI systems. Partnering closely with product and AI engineering teams, you will own the full lifecycle of the data platform, making a direct and visible impact on what we deliver to clients. This is a remote role and a rare opportunity to shape a global product at its earliest stage while working hands-on as part of a small, high-performing team.

Key Responsibilities

Data Platform Architecture and Engineering
  • Architect and implement a secure, multi-tenant data platform using cloud technologies to support SaaS delivery models
  • Design and build scalable ETL/ELT pipelines integrating diverse sources including e-commerce platforms, CRM systems, marketing tools, and APIs
  • Optimize data pipelines and storage layers to ensure scalability, high availability, and consistent performance across multiple tenants
  • Develop and optimize data models to support analytics, reporting, and machine learning with a focus on performance, scalability, and cost efficiency
Platform Operations and DevOps
  • Implement Infrastructure as Code using tools like Terraform and AWS CloudFormation to automate provisioning and scaling of data platforms
  • Enhance platform reliability and performance by applying observability practices including monitoring, logging, and alerting with appropriate tooling
  • Design and manage CI/CD pipelines for data applications, ensuring automated testing, version control, smooth deployments, and rollback strategies
  • Define and enforce enterprise-wide data architecture standards to ensure scalability, interoperability, and alignment with long-term business objectives
Security, Compliance, and Governance
  • Ensure compliance, governance, and security through proper IAM role design, data encryption, and alignment with standards including GDPR, SOC 2, and ISO frameworks
  • Implement data partitioning strategies, tenant isolation protocols, and cost-efficient scaling mechanisms for multi-tenant environments
  • Design and support SaaS observability practices covering SLA monitoring, usage metering, and compliance adherence
Collaboration and Leadership
  • Collaborate with Data Analysts, Data Scientists, AI Engineers, and Business Stakeholders to translate product requirements into scalable cloud-based solutions
  • Mentor data engineers and future team members as the organization grows
  • Proactively troubleshoot and resolve platform issues, implementing root-cause analysis and continuous improvement processes
  • Stay current with emerging cloud, data engineering, and platform technologies, recommending improvements and adopting new solutions
Requirements

Education and Experience
  • Bachelor's degree in Computer Science, Data Engineering, Software Engineering, or a related technical field; Master's degree preferred
  • 5-7 years of experience in Data Engineering, Platform Engineering, or Cloud-related roles, with 3+ years in architecture or platform ownership
  • Experience with international clients, particularly in the US, UK, European, or Australian markets, is strongly preferred
Technical Expertise
  • Deep expertise in AWS and GCP cloud services, covering storage, compute, data processing, analytics, and serverless platforms (e.g., AWS S3/Redshift/Glue/Lambda, GCP BigQuery/Cloud Storage/Dataflow/Cloud Functions)
  • Strong proficiency in SQL, Python, and data modeling for analytical and operational use cases
  • Hands-on experience with production-grade ETL/ELT frameworks and workflow orchestration (e.g., Airflow, dbt, Talend, AWS Glue, GCP Dataform/Cloud Composer)
  • Proven ability to design, deploy, and optimize data warehouses and lakehouse architectures using technologies like BigQuery, Redshift, Snowflake, and Databricks
  • Experience with Infrastructure as Code tools (e.g., Terraform, AWS CloudFormation, GCP Deployment Manager) for cloud resource provisioning and management
  • Proficiency with CI/CD pipelines and DevOps practices for data applications, including Git-based workflows and containerization/orchestration using Docker, ECS, GKE, or Kubernetes
Platform and Architecture Skills
  • Solid understanding of platform engineering concepts including observability, monitoring tools (CloudWatch, Prometheus, Grafana), and automated scaling
  • Proven experience designing and supporting multi-tenant SaaS data platforms with strategies for data partitioning, tenant isolation, and cost management
  • Exposure to real-time data processing technologies such as Kafka, Kinesis, Flink, or Spark Streaming, alongside batch processing capabilities
  • Strong knowledge of SaaS compliance practices and security frameworks
Core Competencies
  • Excellent problem-solving abilities with the capacity to translate requirements into production-grade, maintainable, and observable systems
  • Strong communication and collaboration skills for leading cross-functional technical discussions
  • Ability to mentor junior engineers and provide technical leadership
  • Adaptability to new tools, frameworks, and emerging technologies in the data ecosystem

What We Offer
  • Competitive USD-based compensation packages
  • A flexible, work-from-home setup as part of a globally connected team
  • AI-driven, innovation-focused projects
  • A culture that encourages curiosity and lifelong learning
  • Clear career paths and personal development opportunities

Join us to build a global product from the ground up and redefine how businesses run with AI.

Apply for the role

We'd love to hear from you. Please fill out this form.