Data Sources

Connect anything to TAK

tak-cot-sender deliberately leaves data ingestion to the rich ecosystem of DuckDB community extensions. This page documents the extensions that pair most naturally — each one lets you pull live data and pipe it through tak_send() with minimal SQL.
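As a baseline, no ingestion extension is required at all. A minimal sketch — with an illustrative uid and coordinates — using the tak_send() and cot_event() argument order that every example on this page follows (uid, CoT type, lat, lon, altitude, circular error, linear error, time, start, stale):

SQL
-- Hand-written single event: a friendly ground unit at a fixed position.
-- 9999999 is the CoT convention for "accuracy unknown".
SELECT tak_send(
  cot_event(
    'DEMO-1', 'a-f-G-U-C',
    38.889, -77.035, 0,
    9999999, 9999999,
    now(), now(),
    now() + INTERVAL '5 minutes'
  )
);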

💡

Install community extensions with INSTALL 'extension_name' FROM community; followed by LOAD 'extension_name';. No recompilation of tak-cot-sender is needed.

Real-time polling

HTTP & scheduled feeds

Pull data from REST APIs or any HTTP endpoint on a schedule and forward each record as a CoT event.

🌐

http_client

duckdb.org/community_extensions

Perform HTTP GET/POST requests from SQL. Ideal for one-shot or ad-hoc API polling — fetch a JSON or CSV endpoint and process it inline.

SQL
INSTALL 'http_client' FROM community;
LOAD 'http_client';

SELECT tak_send(
  cot_event(id, 'a-f-G-U-C', lat, lon, alt,
    9999999, 9999999, now(), now(),
    now() + INTERVAL '5 min')
)
FROM read_json_auto(
  http_get('https://api.example.mil/units').body
);
⏱️

cronjob

duckdb.org/community_extensions

Schedule any SQL statement with cron syntax. Combine with http_client to build a recurring pipeline that wakes up, fetches data, and pushes CoT events automatically.

SQL
INSTALL 'cronjob' FROM community;
LOAD 'cronjob';

-- Every 15 seconds, push all active units
SELECT cronjob_create(
  'units_to_tak', '*/15 * * * * *',
  $$
  SELECT tak_send(cot_event(
    unit_id, cot_type('a','f','G-U-C'),
    lat, lon, 0, 9999999, 9999999,
    now(), now(), now() + INTERVAL '30 seconds'
  )) FROM active_units;
  $$
);
Streaming & real-time

Event streams & radio feeds

Subscribe to real-time data streams including radio telemetry, message queues, and event-driven sources.

🌊

tributary

query.farm

Real-time streaming extension from query.farm. Subscribe to change data capture (CDC) streams, Kafka topics, or event queues and process records as they arrive.

SQL
LOAD 'tributary';

-- Stream from a live topic, send each event
SELECT tak_send(
  cot_event(
    device_id, device_type,
    payload.lat, payload.lon, 0,
    9999999, 9999999,
    event_time, event_time,
    event_time + INTERVAL '2 min'
  )
)
FROM tributary_stream('sensor_positions');
📻

radio

query.farm

Ingest RF and radio data directly into DuckDB. APRS, ADS-B, AIS, and other radio protocols can feed position data straight into a CoT pipeline without intermediate processing.

SQL
LOAD 'radio';

-- ADS-B aircraft positions → TAK
SELECT tak_send(
  cot_event(
    icao_hex,
    cot_type('a', 'n', 'A'),  -- neutral air
    lat, lon, altitude_m,
    9999999, 9999999,
    now(), now(),
    now() + INTERVAL '60 seconds'
  )
)
FROM adsb_decode('rtl://0');
Remote filesystems

Cloud, WebDAV & SSH sources

Mount remote filesystems as DuckDB virtual paths and query files as if they were local — Parquet, CSV, JSON, GeoJSON, everything.

☁️

webdavfs

duckdb.org/community_extensions

Mount WebDAV endpoints as a virtual filesystem. Query files from SharePoint, Nextcloud, or any WebDAV-compatible server with standard DuckDB file paths.

SQL
LOAD 'webdavfs';

SELECT tak_send(
  cot_event(id, type_code, lat, lon, 0,
    9999999, 9999999,
    now(), now(), now() + INTERVAL '5 min')
)
FROM read_csv_auto(
  'webdav://sharepoint.example.mil/Shared/units.csv'
);

🔑

sshfs

duckdb.org/community_extensions

Read files from remote servers over SSH/SFTP. Great for mission systems that push logs or position reports to a known remote host without an API.

SQL
LOAD 'sshfs';

SELECT tak_send(
  cot_event(uid, 'a-f-G-U-C', lat, lon, 0,
    9999999, 9999999,
    ts, ts, ts + INTERVAL '5 min')
)
FROM read_parquet(
  'ssh://user@sensor-host.mil/data/positions.parquet'
)
WHERE ts > now() - INTERVAL '1 minute';
HTTP & web scraping

Request & crawler extensions

📡

http_request

duckdb.org/community_extensions

Make full HTTP requests (GET, POST, PUT) with custom headers and body from SQL. Supports Bearer tokens for authenticated API calls to protected position data services.

SQL
LOAD 'http_request';

-- Authenticated API call
SELECT tak_send(
  cot_event(id, 'a-f-G', lat, lon, 0,
    9999999, 9999999,
    now(), now(), now() + INTERVAL '5 min')
)
FROM read_json_auto(
  http_request(
    'GET', 'https://api.example.mil/v2/positions',
    headers => {'Authorization': 'Bearer TOKEN'}
  ).body
);
🕷️

crawler

duckdb.org/community_extensions

Crawl websites and extract structured data from HTML tables or JSON-LD. Useful for scraping public data feeds (NOTAM tables, weather sites, open AIS endpoints) and forwarding to TAK.

SQL
LOAD 'crawler';

-- Crawl a public vessel tracking page
SELECT tak_send(
  cot_event(
    mmsi, cot_type('a','n','S-X-M'),
    lat, lon, 0, 9999999, 9999999,
    now(), now(), now() + INTERVAL '10 min')
)
FROM crawl('https://ais.example.org/vessels')
WHERE type = 'cargo';
Geospatial

Spatial & feature service extensions

Translate vector layers, feature services, and geometry data into TAK-ready CoT events using tak-cot-sender's GIS Translator.

🗺️

spatial

duckdb.org/docs/extensions/spatial

DuckDB's official spatial extension provides GEOMETRY types, ST_Read() for dozens of vector formats (Shapefile, GeoPackage, GeoJSON, FlatGeobuf), and spatial operations like centroid, buffer, and intersection.

SQL
LOAD 'spatial';

-- Publish all features from a GeoPackage
SELECT *
FROM tak_send_features(
  $$
  SELECT geom, asset_id AS uid,
         cot_symbol    AS type
  FROM ST_Read('assets.gpkg')
  WHERE status = 'active'
  $$,
  'uid', 'type'
);

-- Send polygon centroids individually
SELECT tak_send(
  cot_from_geometry(geom, id, 'a-f-G')
)
FROM ST_Read('zones.geojson');
🌐

duckdb_featureserv

github.com/tobilg/duckdb_featureserv

Expose DuckDB tables as OGC API Features endpoints. Combined with tak-cot-sender, you can host a geospatial REST API and forward those same features to TAK Server from the same DuckDB process.

SQL
-- Read from a remote featureserv endpoint
SELECT tak_send(
  cot_from_geometry(
    ST_GeomFromGeoJSON(geometry),
    properties.id,
    properties.cot_type
  )
)
FROM read_json_auto(
  'https://featureserv.mil/api/features/vehicles'
)
WHERE properties.updated_at > now() - INTERVAL '2 min';
Serve & receive

HTTP Server — push events in

Flip the architecture: expose DuckDB as an HTTP endpoint and let external systems push events to you, which you then forward to TAK.

🖥️

httpserver

duckdb.org/community_extensions

Run an HTTP server inside DuckDB. Define endpoints that accept JSON payloads and insert them into a table, then let a background cronjob drain the queue to TAK. This effectively turns DuckDB into a lightweight CoT gateway.

SQL — Define ingest endpoint
LOAD 'httpserver';

-- Accept position reports via POST
SELECT httpserver_start('0.0.0.0', 9999);

CREATE TABLE inbound_positions (
  uid    TEXT,
  lat    DOUBLE,
  lon    DOUBLE,
  alt    DOUBLE,
  ts     TIMESTAMPTZ
);

SELECT httpserver_endpoint(
  '/position',
  $$INSERT INTO inbound_positions
    SELECT * FROM read_json_auto(:body)$$
);
SQL — Drain queue → TAK
-- Drain queue every 5 seconds → TAK
SELECT cronjob_create(
  'drain_to_tak', '*/5 * * * * *',
  $$
  SELECT tak_send(cot_event(
    uid, 'a-f-G-U-C',
    lat, lon, alt,
    9999999, 9999999,
    ts, ts,
    ts + INTERVAL '5 min'
  ))
  FROM inbound_positions;

  DELETE FROM inbound_positions;
  $$
);
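Note that rows arriving between the SELECT and the DELETE above would be deleted without ever being sent. One defensive variant — a sketch reusing the same inbound_positions table — snapshots the queue inside a transaction and deletes only what was actually sent:

SQL
BEGIN TRANSACTION;

-- Snapshot the current queue contents...
CREATE TEMP TABLE drained AS
  SELECT * FROM inbound_positions;

-- ...send exactly the snapshotted rows...
SELECT tak_send(cot_event(
  uid, 'a-f-G-U-C',
  lat, lon, alt,
  9999999, 9999999,
  ts, ts, ts + INTERVAL '5 min'
)) FROM drained;

-- ...and delete only those rows, leaving late arrivals queued.
DELETE FROM inbound_positions AS p
WHERE EXISTS (
  SELECT 1 FROM drained AS d
  WHERE d.uid = p.uid AND d.ts = p.ts
);

DROP TABLE drained;
COMMIT;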
Quick reference

Extension compatibility matrix

Extension     | Source type           | Install           | Pairs well with
------------- | --------------------- | ----------------- | ------------------------------------
http_client   | REST API (one-shot)   | FROM community    | read_json_auto, read_csv_auto
cronjob       | Scheduled execution   | FROM community    | http_client, read_json_auto
tributary     | CDC / event streams   | FROM community    | tak_send, cot_event
radio         | ADS-B, APRS, AIS RF   | FROM community    | cot_type, tak_send
webdavfs      | WebDAV / cloud files  | FROM community    | read_parquet, read_csv_auto
sshfs         | SSH / SFTP remotes    | FROM community    | read_parquet, read_json_auto
http_request  | Authenticated HTTP    | FROM community    | Bearer tokens, POST bodies
crawler       | Web scraping          | FROM community    | HTML tables, JSON-LD
spatial       | GIS / vector formats  | FROM core         | cot_from_geometry, tak_send_features
httpserver    | Inbound HTTP webhook  | FROM community    | cronjob drain queue pattern
featureserv   | OGC API Features      | standalone binary | tak_send_features