Streaming

Overview

Some API endpoints return streaming responses instead of standard paginated responses. Streaming endpoints provide long-running connections that continuously deliver mentions as they match your filters.

Key differences from paginated endpoints:

  • Connection type - Long-running persistent connection vs. individual HTTP requests
  • Data volume - Unlimited data stream vs. fixed batches (25 mentions per request)
  • Response format - Continuous event stream vs. JSON objects with cursor pagination
  • Use case - Bulk data export and real-time monitoring vs. exploration and small datasets

Supported stream types:

  • Export streams - Bulk export of historical or recent data
  • Live streams - Real-time mentions as they are collected by the Listen platform

Supported formats

Server-Sent Events (SSE)

Use the Accept: text/event-stream header to receive data in SSE format:

curl -N -X POST \
  'https://sentione.com/api/public/v2/projects/111/mentions/search/stream' \
  -H 'X-API-KEY: your_secret_key_here' \
  -H 'Content-Type: application/json' \
  -H 'Accept: text/event-stream' \
  -d '{"filters":{},"sortType":"PublishedAtDescending"}'

SSE Response format:

event: mention
data: {"id": "mention_id", "author": {...}, "content": {...}}
id: mention_id

event: heartbeat
data: {"date": "2025-05-22T10:30:00Z"}
id: 2025-05-22T10:30:00Z

event: mention
data: {"id": "another_mention_id", "author": {...}, "content": {...}}
id: another_mention_id
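
For illustration, a minimal Python consumer of the SSE stream, assuming the third-party requests library; the project ID, API key, and the printed field are placeholders, not values prescribed by the API:

import json
import requests

# Illustrative values; substitute your own project ID and API key.
URL = "https://sentione.com/api/public/v2/projects/111/mentions/search/stream"
HEADERS = {
    "X-API-KEY": "your_secret_key_here",
    "Content-Type": "application/json",
    "Accept": "text/event-stream",
}
BODY = {"filters": {}, "sortType": "PublishedAtDescending"}

with requests.post(URL, headers=HEADERS, json=BODY, stream=True) as response:
    response.raise_for_status()
    event, data = None, []
    for raw in response.iter_lines():
        line = raw.decode("utf-8")
        if line == "":                                # blank line terminates one SSE event
            if event == "mention" and data:
                mention = json.loads("".join(data))
                print(mention["id"])
            event, data = None, []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        # "id:" lines and heartbeat events are ignored in this sketch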

JSON Streaming

Use the Accept: application/json header to receive data as JSON Lines (one JSON object per line):

curl -N -X POST \
  'https://sentione.com/api/public/v2/projects/111/mentions/search/stream' \
  -H 'X-API-KEY: your_secret_key_here' \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json' \
  -d '{"filters":{},"sortType":"PublishedAtDescending"}'

JSON Response format:

{"event": "mention", "data": {"id": "mention_id", "author": {...}, "content": {...}}, "id": "mention_id"}
{"event": "heartbeat", "data": {"date": "2025-05-22T10:30:00Z"}, "id": "2025-05-22T10:30:00Z"}
{"event": "mention", "data": {"id": "another_mention_id", "author": {...}, "content": {...}}, "id": "another_mention_id"}

Stream types

There are two types of streaming endpoints available:

Feature       Export Streams                       Live Streams
Purpose       Bulk data export                     Real-time monitoring
Data order    Sorted (configurable)                Random order
Data range    Historical + Recent                  Recent only (7 days)
Duration      Finite (ends when data exhausted)    Infinite (continuous)
Backfill      Not supported                        Optional 0-5 minutes
Sorting       Full sorting support                 No sorting

Working with streams

Connection management

  • Long-running connections - Streams can run for hours or indefinitely
  • Heartbeat monitoring - Use heartbeat events to detect connection health
  • Reconnection logic - Implement automatic reconnection for production use (see the sketch after this list)
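
A minimal sketch of both ideas in Python, assuming the JSON Lines format and the requests library; the 60-second read timeout and the backoff limits are illustrative choices, not values defined by the API:

import json
import time
import requests

def stream_once(url, headers, body, handle_mention, read_timeout=60):
    """Consume one connection; the read timeout trips if heartbeats stop arriving."""
    with requests.post(url, headers=headers, json=body, stream=True,
                       timeout=(10, read_timeout)) as response:
        response.raise_for_status()
        for raw in response.iter_lines():
            if not raw:
                continue
            record = json.loads(raw)
            if record["event"] == "mention":
                handle_mention(record["data"])
            # "heartbeat" events carry no mention data; any received line keeps
            # the socket from idling long enough to hit the read timeout

def run_with_reconnect(url, headers, body, handle_mention):
    """Reconnect with exponential backoff on network errors or stalled heartbeats."""
    delay = 1
    while True:
        try:
            stream_once(url, headers, body, handle_mention)
            break                                     # finite export stream ended normally
        except requests.RequestException:
            time.sleep(delay)                         # back off, then reconnect
            delay = min(delay * 2, 60)

The read timeout doubles as heartbeat monitoring: as long as heartbeat or mention events keep arriving, the connection is considered healthy; once they stop, the timeout fires and the loop reconnects.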

Processing guidelines

  • Non-blocking processing - Process events asynchronously so slow handlers do not block the stream (see the sketch after this list)
  • Incremental processing - Handle events as they arrive; don't wait for the stream to finish
  • Error handling - Implement robust error handling for network issues
  • Buffering - Consider local buffering for critical events
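
One way to follow these guidelines, sketched in Python with a bounded queue; the queue size and handler are placeholders, and handle_mention can be passed to the reconnection sketch above:

import queue
import threading

# Bounded queue acting as a local buffer between the stream reader and a
# worker thread; when it fills up, put() blocks and applies backpressure.
events = queue.Queue(maxsize=1000)

def process(mention):
    print(mention.get("id"))                          # placeholder for real work

def worker():
    """Drain the queue so slow processing never stalls the stream reader."""
    while True:
        mention = events.get()
        if mention is None:                           # sentinel: stream has ended
            break
        process(mention)
        events.task_done()

threading.Thread(target=worker, daemon=True).start()

def handle_mention(mention):
    """Called from the stream loop; it only enqueues and never does heavy work."""
    events.put(mention)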

Best practices

  1. Choose appropriate format - Use SSE when your tooling has native event-stream support; use JSON Lines when simple line-by-line parsing is easier to integrate
  2. Monitor connection health - Use heartbeat events to detect issues
  3. Implement reconnection - Reconnect automatically after dropped connections in production systems
  4. Handle backpressure - Ensure your processing can keep up with incoming data
  5. Use filters effectively - Reduce unnecessary data with precise filters

Common use cases

  • Data export - Bulk extraction of historical mentions
  • Real-time monitoring - Live brand monitoring and crisis detection
  • Analytics pipelines - Streaming data into analytics systems
  • Alert systems - Real-time notifications based on mention criteria
  • Competitive intelligence - Continuous monitoring of competitor mentions