RLM

Recursive Language Model endpoints.

Create an RLM response

POST

OpenAI Responses-style endpoint for Recursive Language Model execution. Accepts a strict subset of the Responses request shape and proxies to the internal RLM backend.

Authorizations
Authorization (string, required)

API key sent in the Authorization header using the Bearer scheme.
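As a sketch, the header can be built like this (the key value is a placeholder, not a real credential):

```python
# Hypothetical key value; real keys are issued by the gateway operator.
API_KEY = "sk-example"

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
print(headers["Authorization"])
```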

Body

Strict OpenAI Responses-style request supported by the RLM gateway endpoint.

model (string, required)

Model slug to run through the RLM backend.

input (any of, required)

String input or a text-only array of Responses message items.

string or RLMResponsesInputMessage[]
instructions (any of, optional)

Optional top-level instructions mapped to the backend root prompt.

string or null
reasoning (any of, optional)

Optional reasoning controls.

null
metadata (any of, optional)

Optional metadata echoed in the gateway response.

or null
store (any of, optional)

Must be false or omitted. Stateful Responses storage is not supported.

boolean or null
stream (any of, optional)

Must be false or omitted. Streaming is not supported for this route.

boolean or null
rlm (any of, optional)

RLM-specific execution controls.

null
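Taken together, a minimal request body and a client-side check mirroring the constraints above might look like this (the model slug and input text are illustrative, not real values):

```python
# Minimal body for POST /v1/rlm/responses.
# Only "model" and "input" are required; "store" and "stream" must be
# false or omitted because stateful storage and streaming are unsupported.
payload = {
    "model": "example-rlm-model",  # hypothetical slug
    "input": "Summarize the attached report.",
    "instructions": "Answer concisely.",
    "store": False,
    "stream": False,
}

def validate_rlm_request(body: dict) -> list[str]:
    """Return a list of violations of the gateway's strict subset."""
    errors = []
    if not isinstance(body.get("model"), str):
        errors.append("model must be a string")
    if not isinstance(body.get("input"), (str, list)):
        errors.append("input must be a string or a message array")
    if body.get("store") not in (None, False):
        errors.append("store must be false or omitted")
    if body.get("stream") not in (None, False):
        errors.append("stream must be false or omitted")
    return errors

print(validate_rlm_request(payload))  # an empty list means the body passes
```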
Responses
200

Successful Response

application/json
id (string, required)
object (const: "response", optional, default: "response")
created_at (integer, required)
status (const: "completed", optional, default: "completed")
model (string, required)
store (boolean, optional, default: false)
output_text (string, required)
usage (any of, optional)
or null
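A successful body with fields like those above can be consumed as plain JSON; in this sketch the id, timestamp, and text are made up:

```python
import json

# Illustrative response body; values are placeholders.
raw = json.dumps({
    "id": "resp_123",
    "object": "response",
    "created_at": 1700000000,
    "status": "completed",
    "model": "example-rlm-model",
    "store": False,
    "output_text": "The report concludes that sales grew.",
    "usage": None,
})

resp = json.loads(raw)
# On a 200, status is always "completed" and output_text carries the answer.
assert resp["status"] == "completed"
print(resp["output_text"])
```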
POST /v1/rlm/responses
