Data Engineer

Posted 4 hours 46 minutes ago by WeAreTechWomen

Permanent
Full Time
Sussex, Crawley, United Kingdom, RH100
Job Description
Overview

Viridien is an advanced technology, digital and Earth data company that pushes the boundaries of science for a more prosperous and sustainable future. We discover new insights, innovations and solutions that address complex natural resource, digital, energy transition and infrastructure challenges.

Job Summary

The Data Engineer plays an important role in developing our software solution, which our clients use to tackle complex data transformation challenges. Our system combines the latest ML-based techniques with logic-based transformation, overseen by domain experts, to deliver innovative solutions. This role supports the development of the data system, focusing on orchestration, resilience and scaling. We also aim to provide a framework on which our data transformation modules can be developed by a growing team of junior engineers and technical SMEs. The role may also support the implementation of the system, including deployment and integration with clients' own data stores, processes and workflows.

Team Description

Data Hub is a dynamic team of scientists and developers who solve complex problems. We provide leading-edge technology solutions and services to address our clients' data transformation and analytics challenges across industries including geothermal, environmental, hydrocarbon and mineral exploration. You will be working in an open and collaborative environment with opportunities to learn, grow and develop. We have an informal team culture and believe work should be fun and rewarding.

You will be based in one of our hub locations (Crawley or Llandudno), working alongside our teams of data engineers, machine learning engineers, software engineers and subject matter experts. We offer hybrid working, and remote working can be considered.

Key Responsibilities
  • Contribute to the development of our data platform infrastructure, including orchestration systems, data processing logic, and interactions between system components.
  • Help develop a flexible framework for data transformations by creating a modular system where new transformation logic can be easily developed and integrated into our product offering.
  • Build robust data pipelines with a focus on dynamic, end-to-end, metadata-driven solutions that consider downstream application/UI data access patterns, maintainability, monitoring, access control, and related aspects.
  • Influence architecture and technology choices. Communicate design ideas and solutions clearly through architectural diagrams and documentation to technical and non-technical stakeholders.
  • Maintain awareness of best practices in software and data engineering, writing secure, performant, and maintainable code (Python, SQL). Minimise technical debt and optimise performance where it matters.
  • Partner with data analysts, data scientists, and other end-users to understand requirements and ensure the platform and its data are accessible, reliable, and meet project delivery needs.
  • Share work and best practices; collaborate with others; ensure what we build and how we build it aligns to our ambition for growth.
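To give a flavour of the kind of modular, metadata-driven design the responsibilities above describe, here is a minimal sketch of a pluggable transformation framework. All names (`transform`, `run_pipeline`, the example steps and fields) are hypothetical illustrations, not part of Viridien's actual system:

```python
# Minimal sketch: a registry-based, metadata-driven transformation pipeline.
# New transformation modules register themselves by name; pipeline composition
# is then driven by a metadata list of step names rather than hard-coded logic.
from typing import Callable, Dict, List

# Registry mapping step names to transformation functions.
TRANSFORMS: Dict[str, Callable[[dict], dict]] = {}

def transform(name: str):
    """Decorator: register a transformation module under a given name."""
    def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        TRANSFORMS[name] = fn
        return fn
    return wrap

@transform("normalise_units")
def normalise_units(record: dict) -> dict:
    # Hypothetical module: convert a depth reading from feet to metres.
    return {**record, "depth_m": record["depth_ft"] * 0.3048}

@transform("flag_outliers")
def flag_outliers(record: dict) -> dict:
    # Hypothetical module: mark implausibly deep readings for review.
    return {**record, "outlier": record["depth_m"] > 10_000}

def run_pipeline(record: dict, steps: List[str]) -> dict:
    """Apply registered transformations in the order given by metadata."""
    for step in steps:
        record = TRANSFORMS[step](record)
    return record

# The step order lives in metadata, so new modules can be added and
# recombined without changing the orchestration code.
result = run_pipeline({"depth_ft": 1000.0}, ["normalise_units", "flag_outliers"])
print(result["depth_m"], result["outlier"])  # → 304.8 False
```

In a production setting the registry and step lists would typically live behind an orchestrator such as Airflow, but the pattern of decoupling transformation logic from pipeline composition is the same.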
Qualifications and Experience

Required:

  • Experience designing, building and maintaining data transformations in a system or product setting.
  • Ability to write secure and performant code in Python and SQL, and optimise queries and data pipelines.
  • Significant experience using orchestrators and ETL tools, especially Airflow.
  • Significant RDBMS experience (PostgreSQL, Oracle). Experience with NoSQL databases (e.g., Neo4j, Elastic) or vector databases is beneficial.
  • Data architecture experience relating to data modelling, data warehousing and schema design (3NF, dimensional modelling, medallion architecture).
  • Experience using Docker, VCS (Git, GitLab) and knowledge of CI/CD.
  • Enthusiastic attitude towards learning and flexibility to adapt to new challenges or changes in direction.

Preferred:

  • Knowledge of DevOps and DataOps best practices.
  • Kubernetes deployment experience.
  • Microsoft Azure and cloud-native data technologies (e.g., Azure Data Factory, Databricks).
  • RESTful API / GraphQL experience.
  • Infrastructure as Code.
  • Previous experience building web applications with knowledge of web frameworks, HTTP, networking, security, etc.
Benefits Package
  • Highly attractive bonus scheme.
  • Initial 22 days' annual leave with future increases, plus a flexible holiday buying/selling programme.
  • Company contributory pension plan.
  • Flexible Private Medical & Dental care tailored to individual or family needs.
  • Employee Assistance Program to support staff.
We Care about our Staff and Environment

We recognise the importance of work-life balance with flexible working and relaxed dress code policies. We support staff wellbeing through various initiatives, including social club events, rewards, and comprehensive benefits such as gym discounts, cycle schemes, and entertainment tickets. We are committed to protecting the environment through sustainable solutions, energy saving and waste reduction initiatives.

Our Hiring Process

We are committed to a respectful, inclusive, and transparent recruitment experience. Due to high application volumes, we may not provide individual feedback to every applicant. Candidates whose qualifications closely match the role criteria will be contacted for an interview. We aim to share personalised feedback with those who progress to the first round and beyond. If you require reasonable adjustments to participate in the application or interview stages, please contact the recruiter directly. We value diversity and are committed to equal employment opportunities for all professionals.