RLM
Authorizations
Authorization (string, Required)
API key passed in the Authorization header using the Bearer scheme.
Body
A strict, OpenAI Responses-style request body supported by the RLM gateway endpoint.
model (string, Required)
Model slug to run through the RLM backend.

input (any of, Required)
String input or a text-only array of Responses message items.
  string (Optional)
  RLMResponsesInputMessage[] (Optional)

instructions (any of, Optional)
Optional top-level instructions mapped to the backend root prompt.
  string (Optional)
  null (Optional)

reasoning (any of, Optional)
Optional reasoning controls.
  null (Optional)

metadata (any of, Optional)
Optional metadata echoed in the gateway response.
  null (Optional)

store (any of, Optional)
Must be false or omitted. Stateful Responses storage is not supported.
  boolean (Optional)
  null (Optional)

stream (any of, Optional)
Must be false or omitted. Streaming is not supported for this route.
  boolean (Optional)
  null (Optional)

rlm (any of, Optional)
RLM-specific execution controls.
  null (Optional)
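Putting the fields above together, a minimal request body can be sketched as follows. The helper name, the placeholder API key, and the model slug "example-rlm-model" are assumptions for illustration, not values from this reference; only the field names and constraints come from the schema above.

```python
import json

def build_rlm_request(api_key: str, model: str, user_text: str) -> tuple[dict, dict]:
    """Return (headers, body) for POST /v1/rlm/responses.

    A sketch only: the caller still has to send it with an HTTP client
    against whatever base URL the gateway is deployed at.
    """
    headers = {
        "Authorization": f"Bearer {api_key}",  # Bearer API key, per Authorizations
        "Content-Type": "application/json",
    }
    body = {
        "model": model,                      # required model slug
        "input": user_text,                  # string form; a message array is also allowed
        "instructions": "Answer briefly.",   # optional, mapped to the backend root prompt
        "store": False,                      # must be false or omitted
        "stream": False,                     # must be false or omitted
    }
    return headers, body

headers, body = build_rlm_request("sk-example", "example-rlm-model", "Hello")
print(json.dumps(body, indent=2))
```

Omitting `store` and `stream` entirely is equally valid; setting either to true is rejected by this route.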
Responses
200
Successful Response
application/json
id (string, Required)
object (const: "response", Optional, Default: "response")
created_at (integer, Required)
status (const: "completed", Optional, Default: "completed")
model (string, Required)
store (boolean, Optional, Default: false)
output_text (string, Required)
usage (any of, Optional)
  null (Optional)
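A hypothetical 200 body matching the schema above, and a small helper that pulls out the generated text. All concrete values (the id, timestamp, model slug, and output) are invented placeholders; the field names and types follow the response schema.

```python
# Example 200 response body (placeholder values, schema-shaped).
sample = {
    "id": "resp_example",
    "object": "response",
    "created_at": 1700000000,
    "status": "completed",
    "model": "example-rlm-model",
    "store": False,
    "output_text": "Hello!",
    "usage": None,  # usage may be null
}

def extract_output(resp: dict) -> str:
    # output_text is the single required text field on this route;
    # guard on status even though this endpoint only emits "completed".
    if resp.get("status") != "completed":
        raise ValueError(f"unexpected status: {resp.get('status')!r}")
    return resp["output_text"]

print(extract_output(sample))
```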
400
Bad Request
application/json
401
Unauthorized
application/json
403
Forbidden
application/json
404
Not Found
application/json
409
Conflict
application/json
429
Too Many Requests
application/json
500
Internal Server Error
application/json
501
Not Implemented
application/json
503
Upstream Unavailable
application/json
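One reasonable way to handle the error statuses listed above is to split them into retryable and non-retryable groups. Treating only 429 and 503 as retryable is a design assumption on our part, not guidance from this reference.

```python
# Sketch of status-code classification for the responses documented above.
RETRYABLE = {429, 503}  # Too Many Requests, Upstream Unavailable

def classify(status: int) -> str:
    if status == 200:
        return "ok"
    if status in RETRYABLE:
        return "retry"   # back off and retry the same request
    if status in {400, 401, 403, 404, 409, 501}:
        return "client"  # fix the request or credentials; retrying will not help
    return "error"       # 500 and anything undocumented

print(classify(429))
```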
post
/v1/rlm/responses