SIEM Audit Sinks

Arcan can stream audit events to external SIEM platforms in real time. Every secret access, policy change, and auth event is forwarded to one or more configured sinks alongside the built-in audit log.

Architecture

All sinks implement a common Sink interface (Send, Name, Close). A Dispatcher fans each audit event out to all registered sinks asynchronously. For HTTP-based sinks, failed deliveries are retried with exponential backoff (1s, 2s, 4s).

Audit Event
└── Dispatcher
    ├── Built-in store (SQLite/PostgreSQL)
    ├── Sink: Splunk
    ├── Sink: Datadog
    └── Sink: File (for log shippers)
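The fan-out and retry behavior can be sketched in Python (the real dispatcher is internal to Arcan and asynchronous; this synchronous model, and the sink class names in it, are illustrative only):

```python
import time

class MemorySink:
    """Illustrative sink satisfying the Send/Name/Close contract."""
    def __init__(self):
        self.events = []
    def name(self):
        return "memory"
    def send(self, event):
        self.events.append(event)
    def close(self):
        pass

class Dispatcher:
    """Fans each audit event out to every registered sink. A failed
    delivery is retried with exponential backoff; after the final
    retry the event is dropped for that sink only, leaving the other
    sinks unaffected."""
    def __init__(self, sinks, backoff=(1, 2, 4)):
        self.sinks = sinks
        self.backoff = backoff  # seconds between retries
    def dispatch(self, event):
        for sink in self.sinks:
            # First attempt has no delay; each retry waits backoff[i].
            for delay in (0,) + tuple(self.backoff):
                time.sleep(delay)
                try:
                    sink.send(event)
                    break
                except Exception:
                    continue  # retry, or drop after the last attempt
```

A slow or failing sink delays only its own deliveries in this model; in the real implementation the asynchronous fan-out additionally keeps request processing unblocked.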

Supported Platforms

Sink                          Protocol                       Auth Method
----                          --------                       -----------
Splunk                        HTTPS (HEC)                    HEC token
Microsoft Sentinel            HTTPS (Data Collector API)     Workspace ID + shared key
Elastic                       HTTPS (Bulk API)               API key
CrowdStrike Falcon LogScale   HTTPS (HEC-compatible)         Ingest token
Palo Alto Cortex XSIAM        Syslog (TCP/TLS, CEF)          TLS certificate
Datadog                       HTTPS (Log Intake)             API key
Google Chronicle              HTTPS (Ingestion API)          OAuth token
Syslog                        TCP/UDP/TLS (RFC 5424)         None / TLS
Webhook                       HTTPS (POST)                   Bearer token / custom headers
File                          Local filesystem (JSON-lines)  N/A

Configuration

Add an audit.sinks section to your config.yaml:

audit:
  sinks:
    - type: splunk
      enabled: true
      endpoint: "https://splunk.example.com:8088"
      token: "your-hec-token"
      options:
        index: security
        sourcetype: "arcan:audit"
        source: arcan

Sinks with enabled: false are ignored. Multiple sinks can run simultaneously.

Platform Setup

Splunk

Uses the HTTP Event Collector (HEC). Create an HEC token in Splunk under Settings > Data Inputs > HTTP Event Collector.

- type: splunk
  enabled: true
  endpoint: "https://splunk.example.com:8088"
  token: "your-hec-token"
  options:
    index: security
    sourcetype: "arcan:audit"
    source: arcan
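Under the hood, an HEC delivery is a POST to /services/collector/event with an `Authorization: Splunk <token>` header and a JSON envelope. A sketch of how the options above would map onto that envelope (the exact field mapping Arcan uses is an assumption):

```python
import json

def hec_payload(event, index="security", sourcetype="arcan:audit", source="arcan"):
    """Wrap an audit event in the Splunk HEC event envelope."""
    return json.dumps({
        "event": event,            # the audit record itself
        "index": index,            # options.index
        "sourcetype": sourcetype,  # options.sourcetype
        "source": source,          # options.source
    })
```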

Microsoft Sentinel

Uses the Log Analytics Data Collector API. Get the workspace ID and shared key from Azure Portal > Log Analytics workspace > Agents.

- type: sentinel
  enabled: true
  endpoint: "" # auto-derived from workspace_id
  token: "base64-encoded-shared-key"
  options:
    workspace_id: "your-workspace-id"
    log_type: ArcanAudit

The token field must contain the base64-encoded shared key (as shown in the Azure portal).
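The Data Collector API signs every request with HMAC-SHA256 over a canonical string, using the decoded shared key, which is why the key must be stored in its base64-encoded form. A sketch of the documented signing scheme (function and variable names here are illustrative, not Arcan internals):

```python
import base64
import hashlib
import hmac

def sentinel_auth_header(workspace_id, shared_key_b64, content_length, rfc1123_date):
    """Build the SharedKey Authorization header for the Log Analytics
    Data Collector API (POST /api/logs)."""
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    key = base64.b64decode(shared_key_b64)  # decode before signing
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(sig).decode()}"
```

The same `x-ms-date` value used in the signature must be sent as a request header, or the API rejects the request with an authorization error.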

Elastic

Uses the Elasticsearch Bulk API. Create an API key in Kibana under Stack Management > API Keys.

- type: elastic
  enabled: true
  endpoint: "https://elastic.example.com:9200"
  token: "your-api-key"
  options:
    index: arcan-audit
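Bulk API requests use a newline-delimited body: an action line, then the document, with a trailing newline at the end. A sketch of the body shape events would take (the action metadata Arcan emits is an assumption):

```python
import json

def bulk_body(events, index="arcan-audit"):
    """Build an Elasticsearch _bulk request body (NDJSON)."""
    lines = []
    for event in events:
        lines.append(json.dumps({"index": {"_index": index}}))  # action line
        lines.append(json.dumps(event))                          # document line
    return "\n".join(lines) + "\n"  # the Bulk API requires a trailing newline
```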

CrowdStrike Falcon LogScale

Uses a HEC-compatible ingestion endpoint. Create an ingest token in LogScale under Settings > Ingest tokens.

- type: crowdstrike
  enabled: true
  endpoint: "https://cloud.community.humio.com"
  token: "your-ingest-token"
  options:
    repository: arcan

Palo Alto Cortex XSIAM

Sends CEF-formatted syslog messages over TCP or TLS.

- type: cortex
  enabled: true
  endpoint: "cortex-syslog.example.com:514"
  options:
    facility: local0
    tls: "true"
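CEF wraps each event in a pipe-delimited header followed by a space-separated key=value extension; pipes and backslashes must be escaped in header fields, and equals signs and backslashes in extension values. A sketch of the framing (the vendor/product/version strings Arcan actually emits are an assumption):

```python
def cef_message(event_id, name, severity, extensions):
    """Format a CEF:0 message: header fields escape '\\' and '|',
    extension values escape '\\' and '='."""
    def esc_header(s):
        return str(s).replace("\\", "\\\\").replace("|", "\\|")
    def esc_ext(s):
        return str(s).replace("\\", "\\\\").replace("=", "\\=")
    header = "|".join([
        "CEF:0", "Arcan", "Arcan", "1.0",
        esc_header(event_id), esc_header(name), str(severity),
    ])
    ext = " ".join(f"{k}={esc_ext(v)}" for k, v in extensions.items())
    return f"{header}|{ext}"
```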

Datadog

Uses the HTTP log intake API. Get your API key from Datadog under Organization Settings > API Keys.

- type: datadog
  enabled: true
  token: "your-dd-api-key"
  options:
    site: datadoghq.com
    tags: "env:prod,service:arcan"

Google Chronicle

Uses the Ingestion API with OAuth authentication. Create a service account with Chronicle Ingestion permissions.

- type: chronicle
  enabled: true
  endpoint: "https://malachiteingestion-pa.googleapis.com"
  token: "oauth-service-account-token"
  options:
    customer_id: "your-chronicle-customer-id"

Syslog

Standard RFC 5424 syslog over TCP, UDP, or TLS.

- type: syslog
  enabled: true
  endpoint: "syslog.example.com:514"
  options:
    protocol: tcp
    facility: auth
    format: rfc5424
    tls: "false"
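RFC 5424 frames each message as `<PRI>VERSION TIMESTAMP HOSTNAME APP-NAME PROCID MSGID STRUCTURED-DATA MSG`, where PRI = facility × 8 + severity (the auth facility is 4). A sketch of the framing (the app name and which fields Arcan populates are assumptions):

```python
def rfc5424(facility, severity, timestamp, hostname, app, msg):
    """Format an RFC 5424 syslog line; '-' marks absent fields
    (PROCID, MSGID, STRUCTURED-DATA)."""
    pri = facility * 8 + severity
    return f"<{pri}>1 {timestamp} {hostname} {app} - - - {msg}"
```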

Webhook

Generic HTTP POST to any endpoint. Works with Slack, PagerDuty, custom APIs, etc.

- type: webhook
  enabled: true
  endpoint: "https://hooks.example.com/audit"
  token: "bearer-token"
  options:
    method: POST
    headers: "X-Source=arcan,X-Env=prod"
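The headers option packs custom headers into a single comma-separated string of Name=Value pairs. A sketch of how such a string would decompose into request headers (the exact parsing rules, e.g. whitespace handling, are an assumption):

```python
def parse_headers(spec):
    """Parse a "Name=Value,Name=Value" header spec into a dict."""
    headers = {}
    for pair in spec.split(","):
        name, _, value = pair.partition("=")
        headers[name.strip()] = value.strip()
    return headers
```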

File

JSON-lines output for consumption by log shippers (Fluentd, Filebeat, Vector).

- type: file
  enabled: true
  options:
    path: /var/log/arcan/audit.jsonl
    rotate: daily
    max_size: "104857600" # bytes (100 MiB)
    max_files: "10"
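JSON-lines means one JSON event per line, which makes the file easy to inspect ad hoc before wiring up a shipper. A minimal reader sketch:

```python
import json

def read_audit_log(path):
    """Yield audit events from a JSON-lines file, skipping blank lines."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)
```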

Multiple Sinks

Enable multiple sinks to forward events to several platforms simultaneously:

audit:
  sinks:
    - type: splunk
      enabled: true
      endpoint: "https://splunk.example.com:8088"
      token: "hec-token"
      options:
        index: security

    - type: file
      enabled: true
      options:
        path: /var/log/arcan/audit.jsonl
        rotate: daily

    - type: webhook
      enabled: false # disabled; will not receive events
      endpoint: "https://hooks.example.com/audit"

Troubleshooting

Sink fails to initialize: Check server logs at startup. Missing required fields (endpoint, token, workspace_id) produce clear error messages with the sink name and missing field.

Events not arriving: Verify enabled: true in your config. Check that the endpoint is reachable from the Arcan server. HTTP sinks retry 3 times with exponential backoff before dropping an event.

TLS errors: The HTTP client uses the system trust store, so self-signed certificates on sink endpoints will fail verification. Either add the CA certificate to the system trust store, or use the file sink as a local buffer and ship with a log agent that handles TLS itself.

High latency: Sinks send events asynchronously via the dispatcher. Slow sinks do not block request processing. If a sink is consistently slow, check network connectivity and consider using the file sink with a log shipper instead.