API Reference
Complete reference for the SensAI REST API, including authentication, available endpoints, and supported AI models.
Overview
The SensAI API is a RESTful service built with FastAPI that provides access to a curated selection of AI models through a unified interface. All AI model requests are routed through OpenRouter, giving you access to models from multiple providers without managing separate API keys for each.
The base URL for all API requests is:
https://api.sensai.jmrinfotech.com/api/v1
Available Sections
- Authentication — Learn how to authenticate your API requests using Bearer tokens
- Endpoints — Detailed reference for all available API endpoints
- Models — Browse the curated list of AI models available through SensAI
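As a quick orientation before diving into those sections, an authenticated request can be sketched in Python using only the standard library. The Bearer scheme matches the Authentication section above; the `/models` path and the placeholder token are illustrative assumptions, not confirmed endpoints.

```python
import urllib.request

BASE_URL = "https://api.sensai.jmrinfotech.com/api/v1"

def build_request(path: str, token: str) -> urllib.request.Request:
    """Build a GET request carrying the Bearer token in the Authorization header."""
    return urllib.request.Request(
        f"{BASE_URL}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

# Hypothetical usage; '/models' is an assumed path for the models listing.
req = build_request("/models", "YOUR_API_KEY")
```

Sending the request (for example with `urllib.request.urlopen(req)`) then returns the JSON response described below.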
Response Format
All API responses are returned as JSON. Streaming endpoints (such as the chat stream) return server-sent events (SSE) that you can consume in real time.
Error responses follow a consistent format with an appropriate HTTP status code and a JSON body containing an error message and details to help you diagnose and resolve issues quickly.
Streaming Responses
For the chat stream endpoint, the API returns a series of server-sent events where each event contains a partial response chunk. Your client should consume these events incrementally to display tokens as they arrive, providing a responsive and interactive user experience.
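To make the event flow concrete, here is a minimal Python sketch of an SSE consumer. It assumes the standard `data:` line framing and a `[DONE]` sentinel marking the end of the stream; both are common SSE conventions for chat APIs but are assumptions, not details confirmed by this reference.

```python
from typing import Iterable, Iterator

def iter_sse_chunks(lines: Iterable[str]) -> Iterator[str]:
    """Yield partial response chunks from a stream of SSE lines.

    Assumes 'data: <chunk>' framing and a '[DONE]' terminator,
    both hypothetical conventions shown for illustration.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        yield payload

# Simulated stream; a real client would read lines from the HTTP response.
sample = ["data: Hel", "", "data: lo", "data: [DONE]"]
print("".join(iter_sse_chunks(sample)))  # → Hello
```

In a real client you would feed this generator the response body line by line and render each chunk as it arrives, rather than joining them at the end.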
Error Handling
All error responses include a message field describing the issue and, where applicable, a details field with additional context. Common HTTP status codes include 401 for authentication failures, 422 for validation errors, and 429 when rate limits are exceeded.
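A client-side handler for this error shape might look like the following sketch. The `message` and `details` field names follow the description above; the per-status hints are an illustrative assumption about how a client could react, not prescribed behavior.

```python
import json

def describe_error(status: int, body: str) -> str:
    """Turn an API error response into a human-readable diagnostic string."""
    payload = json.loads(body)
    message = payload.get("message", "unknown error")
    details = payload.get("details")  # optional extra context
    if status == 401:
        hint = "check your Bearer token"
    elif status == 422:
        hint = "check the request payload"
    elif status == 429:
        hint = "back off and retry later"
    else:
        hint = "see details"
    text = f"{status}: {message} ({hint})"
    return f"{text} - {details}" if details else text

print(describe_error(401, '{"message": "Invalid token"}'))
# → 401: Invalid token (check your Bearer token)
```

For 429 responses in particular, a production client would typically pair this with exponential backoff before retrying.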