High-level flow

  1. ONVY receives provider or application data.
  2. Raw payloads are stored and queued for processing.
  3. Calculation workers harmonize data into stable record families.
  4. Downstream services generate summaries, scores, and webhook events.
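The four stages above can be sketched as a minimal pipeline. This is an illustrative sketch only; all class, field, and record names here are assumptions, not ONVY's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    """Hypothetical stand-in for the ingest -> harmonize -> publish flow."""
    raw_store: list = field(default_factory=list)   # stage 2: stored raw payloads
    queue: list = field(default_factory=list)       # stage 2: queued for processing
    records: list = field(default_factory=list)     # stage 3: harmonized record families
    events: list = field(default_factory=list)      # stage 4: downstream outputs

    def receive(self, payload: dict) -> None:
        """Stages 1-2: persist the raw payload and queue it for processing."""
        self.raw_store.append(payload)
        self.queue.append(payload)

    def process(self) -> None:
        """Stage 3: harmonize queued payloads into stable record families."""
        while self.queue:
            raw = self.queue.pop(0)
            self.records.append({
                "family": raw.get("type", "daily_records"),  # assumed default
                "user": raw["user_id"],
                "data": raw["data"],
            })

    def publish(self) -> None:
        """Stage 4: derive summaries/scores and emit webhook-style events."""
        for rec in self.records:
            self.events.append({"event": f"{rec['family']}.updated",
                                "user": rec["user"]})
```

The key structural point is that raw storage and harmonization are decoupled: a payload is always persisted before any calculation runs against it.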

Terra ingestion pattern

The Terra pipeline is the primary example of this delayed-ingestion pattern:
  1. A Terra webhook writes raw data and schedules work in DynamoDB.
  2. A delayed SQS message gives related payloads time to accumulate.
  3. The calculation Lambda processes scheduled items in batches by user.
  4. Harmonized outputs become available through routes such as daily_records, activities, and baselines.

That short delay reduces duplicate work when several provider payloads arrive back-to-back.
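The scheduling half of this pattern can be sketched in a few lines. A plain list stands in for the DynamoDB schedule table and a timestamp comparison stands in for the SQS message delay; the field names and the 60-second delay are assumptions for illustration, not ONVY's real schema.

```python
import time
from collections import defaultdict

scheduled = []          # stands in for the DynamoDB schedule table
DELAY_SECONDS = 60      # stands in for the delayed SQS message (assumed value)

def on_terra_webhook(payload: dict) -> None:
    """Steps 1-2: store the raw payload and schedule work for later."""
    scheduled.append({
        "user_id": payload["user_id"],
        "ready_at": time.time() + DELAY_SECONDS,
        "payload": payload,
    })

def calculation_worker(now: float) -> dict:
    """Step 3: pick up due items and group them into one batch per user.

    In the real pipeline, due items would also be removed from the table
    after processing; that bookkeeping is omitted here.
    """
    due = [item for item in scheduled if item["ready_at"] <= now]
    batches = defaultdict(list)
    for item in due:
        batches[item["user_id"]].append(item["payload"])
    return dict(batches)   # one harmonization pass per user
```

Because both payloads for the same user land in one batch, the calculation runs once instead of once per webhook, which is exactly the duplicate-work reduction described above.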

What processing produces

  • daily_records for user-facing scores, zones, and logs
  • facts for durable AI personalization context
  • activities and meals for structured event history
  • ai_summaries when a supported summary flow runs
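A routing table makes the mapping from harmonized outputs to these record families concrete. The family names on the right come from the list above; the output kinds on the left are hypothetical labels for illustration.

```python
# Family names come from the documented outputs; the kind keys are assumed.
FAMILY_BY_KIND = {
    "score": "daily_records",   # user-facing scores, zones, and logs
    "fact": "facts",            # durable AI personalization context
    "activity": "activities",   # structured event history
    "meal": "meals",
    "summary": "ai_summaries",  # persisted results of a summary flow
}

def route_output(kind: str, body: dict) -> tuple[str, dict]:
    """Return the record family a harmonized output would be written to."""
    return FAMILY_BY_KIND[kind], body
```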

AI surfaces in the pipeline

AI features consume the same underlying user context:
  • Chat completions enrich prompts with ONVY context unless the request opts out of selected sections
  • AI summaries persist generated results so you can list, fetch, and audit them later
Internal EventBridge events use a flat detail.data shape. The batched events[] envelope described in /webhooks is only for external webhook delivery.
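The two shapes can be contrasted side by side. Only the flat detail.data field and the batched events[] envelope are taken from the text above; every other field name and value here is an illustrative assumption.

```python
# Internal EventBridge event: a single flat object under detail.data.
internal_event = {
    "detail-type": "daily_record.updated",            # assumed event name
    "detail": {"data": {"user_id": "u_123", "score": 82}},
}

# External webhook delivery: multiple events batched into an events[] envelope.
external_delivery = {
    "events": [
        {"type": "daily_record.updated",
         "data": {"user_id": "u_123", "score": 82}},
    ],
}
```

A consumer of internal events should therefore read detail.data directly and must not expect the events[] wrapper, which appears only on external webhook payloads.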