tak-cot-sender deliberately leaves data ingestion to the rich ecosystem
of DuckDB community extensions. This page documents the extensions that
pair most naturally: each one lets you pull live data and pipe it
through tak_send() with minimal SQL.
Install community extensions with INSTALL 'extension_name' FROM community;
then load them with LOAD 'extension_name';. No recompilation of tak-cot-sender is needed.
Pull data from REST APIs or any HTTP endpoint on a schedule and forward each record as a CoT event.
Perform HTTP GET/POST requests from SQL. Ideal for one-shot or ad-hoc API polling: fetch a JSON or CSV endpoint and process it inline.
INSTALL 'http_client' FROM community;
LOAD 'http_client';
SELECT tak_send(
  cot_event(id, 'a-f-G-U-C', lat, lon, alt,
            9999999, 9999999,  -- ce/le: 9999999 is the CoT value for "unknown error"
            now(), now(),
            now() + INTERVAL '5 min')
)
FROM read_json_auto(
  http_get('https://api.example.mil/units').body
);
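tak-cot-sender's exact wire format is not shown on this page, but the arguments to cot_event() map directly onto the standard Cursor-on-Target XML schema (an event element with uid/type/time/start/stale attributes and a nested point). A minimal Python sketch of that mapping; the cot_event_xml helper is hypothetical, only the CoT attribute names are standard:

```python
from datetime import datetime, timedelta, timezone
import xml.etree.ElementTree as ET

def cot_event_xml(uid, cot_type, lat, lon, hae, ce=9999999, le=9999999,
                  stale_s=300):
    """Build a minimal CoT <event> mirroring the cot_event() arguments."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%SZ")
    event = ET.Element("event", version="2.0", uid=uid, type=cot_type,
                       time=iso(now), start=iso(now),
                       stale=iso(now + timedelta(seconds=stale_s)),
                       how="m-g")  # how="m-g": machine-generated GPS position
    ET.SubElement(event, "point", lat=str(lat), lon=str(lon), hae=str(hae),
                  ce=str(ce), le=str(le))
    return ET.tostring(event, encoding="unicode")

print(cot_event_xml("UNIT-1", "a-f-G-U-C", 48.85, 2.35, 0))
```

The stale attribute plays the same role as the `now() + INTERVAL '5 min'` argument above: it tells TAK clients when to drop the event from the map.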
Schedule any SQL statement with cron syntax. Combine with http_client to build a recurring pipeline that wakes up, fetches data, and pushes CoT events automatically.
INSTALL 'cronjob' FROM community;
LOAD 'cronjob';
-- Every 15 seconds, push all active units
SELECT cronjob_create(
  'units_to_tak', '*/15 * * * * *',
  $$
    SELECT tak_send(cot_event(
      unit_id, cot_type('a','f','G-U-C'),
      lat, lon, 0, 9999999, 9999999,
      now(), now(), now() + INTERVAL '30 seconds'
    )) FROM active_units;
  $$
);
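The schedule string above uses six fields (seconds, minutes, hours, day-of-month, month, day-of-week), so '*/15 * * * * *' fires every 15 seconds. As a sanity check on that syntax, here is a tiny expander for a single cron field; it is illustrative only and not part of the cronjob extension:

```python
def expand_field(field: str, lo: int, hi: int) -> list[int]:
    """Expand one cron field ('*', '*/n', 'a-b', 'a,b,c', combinations) into values."""
    values = []
    for part in field.split(","):
        step = 1
        if "/" in part:                      # step syntax, e.g. */15 or 10-20/5
            part, step_s = part.split("/")
            step = int(step_s)
        if part == "*":
            start, end = lo, hi
        elif "-" in part:                    # range syntax, e.g. 10-20
            start, end = (int(x) for x in part.split("-"))
        else:                                # single value
            start = end = int(part)
        values.extend(range(start, end + 1, step))
    return sorted(set(values))

# The seconds field of '*/15 * * * * *' fires at:
print(expand_field("*/15", 0, 59))   # → [0, 15, 30, 45]
```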
Subscribe to real-time data streams including radio telemetry, message queues, and event-driven sources.
Tributary is a real-time streaming extension from query.farm. Subscribe to change data capture (CDC) streams, Kafka topics, or event queues and process records as they arrive.
LOAD 'tributary';
-- Stream from a live topic, send each event
SELECT tak_send(
  cot_event(
    device_id, device_type,
    payload.lat, payload.lon, 0,
    9999999, 9999999,
    event_time, event_time,
    event_time + INTERVAL '2 min'
  )
)
FROM tributary_stream('sensor_positions');
Ingest RF and radio data directly into DuckDB. APRS, ADS-B, AIS, and other radio protocols can feed position data straight into a CoT pipeline without intermediate processing.
LOAD 'radio';
-- ADS-B aircraft positions → TAK
SELECT tak_send(
  cot_event(
    icao_hex,
    cot_type('a', 'n', 'A'),  -- neutral air
    lat, lon, altitude_m,
    9999999, 9999999,
    now(), now(),
    now() + INTERVAL '60 seconds'
  )
)
FROM adsb_decode('rtl://0');
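The cot_type() helper used throughout these examples composes the hyphenated CoT type string: 'a' for an atom (a physical thing), an affiliation letter (f friendly, h hostile, n neutral, u unknown), then a battle-dimension/function tail. A rough Python stand-in; the DuckDB function's exact behavior is an assumption, only the CoT type grammar is standard:

```python
# Standard CoT affiliation letters (second component of an atom type string)
AFFILIATIONS = {"f": "friendly", "h": "hostile", "n": "neutral", "u": "unknown"}

def cot_type(scheme: str, affiliation: str, tail: str) -> str:
    """Join CoT type components, e.g. ('a', 'n', 'A') -> 'a-n-A' (neutral air)."""
    if affiliation not in AFFILIATIONS:
        raise ValueError(f"unknown affiliation: {affiliation!r}")
    return "-".join((scheme, affiliation, tail))

print(cot_type("a", "n", "A"))        # → a-n-A
print(cot_type("a", "f", "G-U-C"))    # → a-f-G-U-C
```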
Mount remote filesystems as DuckDB virtual paths and query files as if they were local: Parquet, CSV, JSON, GeoJSON, and more.
Mount WebDAV endpoints as a virtual filesystem. Query files from SharePoint, Nextcloud, or any WebDAV-compatible server with standard DuckDB file paths.
LOAD 'webdavfs';
SELECT tak_send(
  cot_event(id, type_code, lat, lon, 0,
            9999999, 9999999,
            now(), now(), now() + INTERVAL '5 min')
)
FROM read_csv_auto(
  'webdav://sharepoint.example.mil/Shared/units.csv'
);
Read files from remote servers over SSH/SFTP. Great for mission systems that push logs or position reports to a known remote host without an API.
LOAD 'sshfs';
SELECT tak_send(
  cot_event(uid, 'a-f-G-U-C', lat, lon, 0,
            9999999, 9999999,
            ts, ts, ts + INTERVAL '5 min')
)
FROM read_parquet(
  'ssh://user@sensor-host.mil/data/positions.parquet'
)
WHERE ts > now() - INTERVAL '1 minute';
Make full HTTP requests (GET, POST, PUT) with custom headers and body from SQL. Supports Bearer tokens for authenticated API calls to protected position data services.
LOAD 'http_request';
-- Authenticated API call
SELECT tak_send(
  cot_event(id, 'a-f-G', lat, lon, 0,
            9999999, 9999999,
            now(), now(), now() + INTERVAL '5 min')
)
FROM read_json_auto(
  http_request(
    'GET', 'https://api.example.mil/v2/positions',
    headers => {'Authorization': 'Bearer TOKEN'}
  ).body
);
Crawl websites and extract structured data from HTML tables or JSON-LD. Useful for scraping public data feeds (NOTAM tables, weather sites, open AIS endpoints) and forwarding to TAK.
LOAD 'crawler';
-- Crawl a public vessel tracking page
SELECT tak_send(
  cot_event(
    mmsi, cot_type('a','n','S-X-M'),
    lat, lon, 0, 9999999, 9999999,
    now(), now(), now() + INTERVAL '10 min')
)
FROM crawl('https://ais.example.org/vessels')
WHERE type = 'cargo';
Translate vector layers, feature services, and geometry data into TAK-ready CoT events using tak-cot-sender's GIS Translator.
DuckDB's official spatial extension provides GEOMETRY types,
ST_Read() for dozens of vector formats (Shapefile,
GeoPackage, GeoJSON, FlatGeobuf), and spatial operations like
centroid, buffer, and intersection.
LOAD 'spatial';
-- Publish all features from a GeoPackage
SELECT *
FROM tak_send_features(
  $$
    SELECT geom, asset_id AS uid,
           cot_symbol AS type
    FROM ST_Read('assets.gpkg')
    WHERE status = 'active'
  $$,
  'uid', 'type'
);

-- Send polygon centroids individually
SELECT tak_send(
  cot_from_geometry(ST_Centroid(geom), id, 'a-f-G')
)
FROM ST_Read('zones.geojson');
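The "polygon centroids" step is, mathematically, the area-weighted (shoelace) ring centroid that the spatial extension's ST_Centroid computes for simple polygons. A stdlib sketch of the same formula, useful for spot-checking a result without the extension loaded:

```python
def polygon_centroid(ring):
    """Area-weighted centroid of a simple polygon ring [(x, y), ...]."""
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]              # close the ring
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(ring, ring[1:]):
        cross = x0 * y1 - x1 * y0            # shoelace term
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5                                 # signed area
    return cx / (6 * a), cy / (6 * a)

# Unit square centroid:
print(polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)]))   # → (0.5, 0.5)
```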
Expose DuckDB tables as OGC API Features endpoints. Combined with tak-cot-sender, you can host a geospatial REST API and forward those same features to TAK Server from the same DuckDB process.
-- Read from a remote featureserv endpoint.
-- An OGC API Features items response is a FeatureCollection,
-- so unnest its features array before building events.
SELECT tak_send(
  cot_from_geometry(
    ST_GeomFromGeoJSON(to_json(f.geometry)::VARCHAR),
    f.properties.id,
    f.properties.cot_type
  )
)
FROM (
  SELECT unnest(features) AS f
  FROM read_json_auto(
    'https://featureserv.mil/api/features/vehicles'
  )
)
WHERE f.properties.updated_at::TIMESTAMPTZ > now() - INTERVAL '2 min';
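One gotcha when consuming features this way: GeoJSON stores coordinates as [longitude, latitude] (RFC 7946), the reverse of the (lat, lon) order the cot_event() examples use. A small stdlib check; the feature payload below is a hypothetical example of what such an endpoint might return:

```python
import json

feature = json.loads("""
{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [2.35, 48.85]},
  "properties": {"id": "veh-17", "cot_type": "a-f-G"}
}
""")

# GeoJSON order is [lon, lat] -- swap before building a CoT point
lon, lat = feature["geometry"]["coordinates"]
print(lat, lon)   # → 48.85 2.35
```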
Flip the architecture: expose DuckDB as an HTTP endpoint and let external systems push events to you, which you then forward to TAK.
Run an HTTP server inside DuckDB. Define endpoints that accept JSON payloads and insert them into a table, then let a background cronjob drain the queue to TAK. This effectively turns DuckDB into a lightweight CoT gateway.
LOAD 'httpserver';
-- Accept position reports via POST
SELECT httpserver_start('0.0.0.0', 9999);

CREATE TABLE inbound_positions (
  uid TEXT,
  lat DOUBLE,
  lon DOUBLE,
  alt DOUBLE,
  ts TIMESTAMPTZ
);

SELECT httpserver_endpoint(
  '/position',
  $$INSERT INTO inbound_positions
    SELECT * FROM read_json_auto(:body)$$
);
-- Drain queue every 5 seconds → TAK
SELECT cronjob_create(
  'drain_to_tak', '*/5 * * * * *',
  $$
    -- Run send and delete in one transaction so the DELETE removes
    -- only the rows this drain saw; rows arriving mid-drain aren't lost.
    BEGIN TRANSACTION;
    SELECT tak_send(cot_event(
      uid, 'a-f-G-U-C',
      lat, lon, alt,
      9999999, 9999999,
      ts, ts,
      ts + INTERVAL '5 min'
    ))
    FROM inbound_positions;
    DELETE FROM inbound_positions;
    COMMIT;
  $$
);