
Architectural Overview: Data Sovereignty & Pipeline Security

Learn how Defense.com can deliver data-sovereign security services at scale.

Written by Daniel Sampson

Our architectural approach is designed to provide high-performance threat detection and response while maintaining strict adherence to global data residency and sovereignty requirements. By utilising a tiered data pipeline, the solution ensures that sensitive information remains under regional control while benefiting from global threat intelligence at scale.

The platform utilises a distributed architecture optimised for massive scale. This structure decouples data ingestion from long-term storage, allowing for regional isolation of sensitive logs.
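To illustrate the decoupling described above, the sketch below models ingestion as a shared path while storage stays partitioned by region. The event fields and region names are hypothetical examples, not the platform's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class LogEvent:
    source: str
    region: str   # e.g. "eu-west", "us-east" (illustrative region tags)
    payload: str

@dataclass
class RegionalPipeline:
    # region -> list of events; each region's store is isolated
    stores: dict = field(default_factory=dict)

    def ingest(self, event: LogEvent) -> None:
        # Shared ingestion logic, regionally partitioned storage.
        self.stores.setdefault(event.region, []).append(event)

pipeline = RegionalPipeline()
pipeline.ingest(LogEvent("firewall-01", "eu-west", "deny tcp 10.0.0.5"))
pipeline.ingest(LogEvent("idp-02", "us-east", "login success"))
# Events land only in the store for their own region.
```

The point of the sketch is the routing decision: ingestion code is global, but an event's region tag determines the only store it can ever reach.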

Localised Ingestion & Edge Collection

The pipeline begins at the customer site with Customer Collectors (local VMs or agents).

  • Customer Control: This layer grants organisations granular control over what data is collected and filtered before it ever leaves the corporate perimeter.

  • Secure Transit: All data is encrypted in transit via TLS-encrypted tunnels to an In-Region Edge Collector. This edge collector typically resides within a sovereign cloud provider or local data centre, ensuring that the primary data processing remains within the specific geopolitical jurisdiction.
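A minimal sketch of the collector-side filtering step: only event categories on a local allowlist leave the perimeter, and everything else is dropped before the TLS tunnel to the edge collector. The category names and allowlist patterns here are assumptions for illustration:

```python
import fnmatch

# Assumed example categories; a real deployment would define its own.
ALLOWLIST = ["auth.*", "firewall.deny", "edr.alert"]

def should_forward(event_category: str) -> bool:
    # Shell-style pattern match against the local allowlist.
    return any(fnmatch.fnmatch(event_category, p) for p in ALLOWLIST)

events = ["auth.login", "dns.query", "firewall.deny", "printer.status"]
forwarded = [e for e in events if should_forward(e)]
# Only the allowlisted events would be sent over the TLS tunnel.
```

In practice the `forwarded` batch would then be written to a TLS-wrapped connection (for example via Python's `ssl` module); the filtering shown here is the part that keeps unwanted data inside the corporate perimeter.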

The Cyber Fusion Engine (High-Performance Detection Layer)

The regional collectors forward processed data to the Cyber Fusion Engine. This is a high-performance cluster optimised for real-time analysis and correlation.

  • The 90-Day Retention Window: Logs are retained in this active layer for less than 90 days to allow for parsing, threat detection and enrichment with threat intelligence. This window is dedicated to "hot" analysis and threat hunting, focused on active threats and on correlating patterns across a broader telemetry landscape.

  • Collective Intelligence: While individual logs remain isolated, the Fusion Engine applies global threat intelligence and detection logic. This allows a threat identified in one sector to automatically harden the defences for all customers in the ecosystem.

  • Data Scrubbing: During this phase, automated masking of sensitive data and removal of PII (Personally Identifiable Information) are applied wherever such data is detected, ensuring the central fusion layer remains compliant with privacy mandates.
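The data-scrubbing step can be sketched as pattern-based masking applied to each log line before it enters the fusion layer. The two regexes below (email addresses and IPv4 addresses) are illustrative only; the platform's actual detection rules are not documented here:

```python
import re

# Illustrative PII patterns; a production system would use a far
# richer rule set (names, card numbers, national IDs, etc.).
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ipv4":  re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def mask_pii(log_line: str) -> str:
    # Replace each detected PII value with a labelled placeholder.
    for label, pattern in PII_PATTERNS.items():
        log_line = pattern.sub(f"[{label.upper()}-MASKED]", log_line)
    return log_line

masked = mask_pii("login failure for jane.doe@example.com from 192.168.1.10")
```

Masking rather than deleting the field keeps the log line structurally intact for correlation while removing the identifying value itself.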

Long-Term Forensic Archiving

For compliance and historical investigations, data is offboarded from the fusion stack into a Long-Term Archive.

  • Sovereign Storage: This archive is stored as forensic objects (utilising highly durable S3-compatible object storage) within a local cloud region.

  • Forensic Integrity: Logs are kept for a minimum of one year in a compressed, immutable format. This ensures that while the "active" data in the Fusion Engine is cycled out, the historical record remains accessible within the customer's required geographic borders for audit and legal discovery.
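The offboarding step above can be sketched as compressing a batch of logs and writing it under a region-scoped, date-partitioned key suitable for S3-compatible object storage. The bucket layout and key format here are assumptions, not the platform's real naming scheme:

```python
import gzip
from datetime import datetime, timezone

def archive_object_key(region: str, day: datetime) -> str:
    # Region-scoped prefix keeps the archive inside the required
    # jurisdiction; date partitioning aids audit and legal discovery.
    return f"{region}/forensic-archive/{day:%Y/%m/%d}/logs.jsonl.gz"

def compress_batch(lines: list[str]) -> bytes:
    # Compressed, newline-delimited JSON is a common forensic format.
    return gzip.compress("\n".join(lines).encode("utf-8"))

key = archive_object_key("eu-west", datetime(2024, 5, 1, tzinfo=timezone.utc))
blob = compress_batch(['{"event": "login"}', '{"event": "logout"}'])
# A real deployment would additionally set object-lock / retention
# metadata on the stored object to enforce immutability.
```

Immutability itself would come from the object store's retention controls (for example S3 Object Lock on compatible storage), not from the application code.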

High-Level Architectural Design

[Diagram: high-level architectural design of the data pipeline]
