OpenAI

The OpenAI connection allows the Intelligence Hub to interact with OpenAI models for generative AI and LLM workflows. The connection sends prompts to, and receives responses from, models hosted by OpenAI. Below are details on the connection and input settings.

Connection Settings

Password: The password used to authenticate with the OpenAI service. This is typically an API key.

Base URL: (Optional) The base URL for the OpenAI API. If not provided, the default OpenAI endpoint is used. Many LLM engines expose an OpenAI-compatible API, so this setting can be used to interact with local LLM endpoints (e.g. LM Studio), as sketched below.
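For reference, the sketch below shows how these two settings map onto an OpenAI-style client. It uses the official openai Python package purely for illustration; the Intelligence Hub performs the equivalent requests internally, and the LM Studio URL shown is an assumed default for your local setup.

```python
# Illustrative only: how the Password and Base URL settings map onto an
# OpenAI-compatible client. Uses the "openai" Python package; the Intelligence
# Hub makes the equivalent HTTP calls itself.
from openai import OpenAI

# Password -> API key issued by OpenAI.
hosted_client = OpenAI(api_key="sk-...")

# Base URL -> any OpenAI-compatible endpoint, e.g. a local LM Studio server.
# "http://localhost:1234/v1" is LM Studio's usual default and is an assumption.
local_client = OpenAI(
    api_key="lm-studio",  # local servers typically ignore the key but still require a value
    base_url="http://localhost:1234/v1",
)
```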

Input Settings

Model: An identifier for the model to use. This is typically a model name or ID provided by OpenAI, for example "gpt-4" or "o4-mini".

Instructions: (Optional) The prompt or instructions that guide the model's response.

Message: The message sent to the model. This is the input the model processes based on the instructions.

Response Format: The format in which the model should respond. When set to JSON, the connection attempts to parse the response as JSON; if the response is not valid JSON, the read fails. See the sketch below for how these settings map onto a request.
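The example below is a rough sketch, again using the openai Python package for illustration only. The model name, prompt text, and JSON shape are assumptions rather than values required by the connection; the point is how Model, Instructions, Message, and Response Format correspond to the request and the parse step.

```python
# Illustrative only: Model, Instructions, Message, and Response Format = JSON
# expressed as a chat completion request with the "openai" Python package.
import json

from openai import OpenAI

client = OpenAI(api_key="sk-...")

completion = client.chat.completions.create(
    model="gpt-4o",  # Model setting (example value; JSON mode needs a model that supports it)
    messages=[
        # Instructions setting -> system message guiding the response.
        {"role": "system", "content": "Reply with a JSON object containing a 'summary' field."},
        # Message setting -> user message the model processes.
        {"role": "user", "content": "Summarize the latest batch report."},
    ],
    # Response Format = JSON -> ask the model for a JSON object.
    response_format={"type": "json_object"},
)

# The connection parses the reply as JSON; an unparseable reply fails the read.
try:
    result = json.loads(completion.choices[0].message.content or "")
except json.JSONDecodeError as exc:
    raise ValueError("Response was not valid JSON") from exc
```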