DV Cleared Data Engineer

Posted 20 hours 5 minutes ago by IO Associates

Contract
Somerset, Bristol, United Kingdom, BS483
Job Description

DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi

Location: Bristol | Rate: £430.00 per day (Outside IR35) | Working Pattern: Hybrid (3-4 days on-site)

Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi to take the lead in building robust, real-time data pipelines in a security-focused environment.

This is a hands-on contract opportunity to make a real impact, ideal for professionals with a strong track record in regulated sectors.

What You'll Be Doing

  • Designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi

  • Handling real-time data ingestion and transformation with an emphasis on integrity and availability

  • Collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs

  • Monitoring and optimising high-throughput data flows across on-prem and cloud environments

  • Building insightful Kibana dashboards to support business intelligence and operational decision-making

  • Maintaining documentation of data flows, architecture, and security procedures to ensure audit-readiness

Your Experience

Must-Have:

  • Minimum 3 years' experience as a Data Engineer in sensitive or regulated industries

  • Proficiency in the full Elastic Stack for data processing, analytics, and visualisation

  • Hands-on expertise with Apache NiFi in designing sophisticated data workflows

  • Solid scripting capabilities using Python, Bash, or similar

  • Familiarity with best practices in data protection (encryption, anonymisation, access control)

  • Experience managing large-scale, real-time data pipelines

  • Working knowledge of cloud services (AWS, Azure, GCP), especially around secure deployment

Nice-to-Have:

  • Background in government, defence, or highly regulated sectors

  • Exposure to big data tools like Kafka, Spark, or Hadoop

  • Understanding of containerisation and orchestration (e.g. Docker, Kubernetes)

  • Familiarity with infrastructure-as-code tools (e.g. Terraform, Ansible)

  • Experience building monitoring solutions with Prometheus, Grafana, or ELK

  • Interest in or exposure to machine learning-driven data systems