langtest.modelhandler.lmstudio_modelhandler.chat_completion_api
- chat_completion_api(text: str, url: str, server_prompt: str, **kwargs)
Send a user text message to a chat completion API and receive the model’s response.
- Parameters:
text (str) – The user's input text.
url (str) – The API endpoint URL.
server_prompt (str) – The server prompt for the chat. Defaults to a single space.
**kwargs – Additional keyword arguments.
- Keyword Arguments:
temperature (float, optional) – The temperature parameter for controlling randomness. Defaults to 0.7.
max_tokens (int, optional) – The maximum number of tokens to generate. Defaults to -1 (no limit).
stream (bool, optional) – Whether to use streaming for long conversations. Defaults to False.
- Returns:
The JSON response from the API if successful, otherwise None.
- Return type:
dict or None
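A minimal usage sketch, assuming a local LM Studio server exposing an OpenAI-compatible chat-completions endpoint; the URL, prompt text, and response-parsing path below are illustrative assumptions rather than values fixed by the library:

```python
from langtest.modelhandler.lmstudio_modelhandler import chat_completion_api

# Hypothetical local LM Studio endpoint; adjust to your own server address and port.
response = chat_completion_api(
    text="Summarize the plot of Hamlet in two sentences.",
    url="http://localhost:1234/v1/chat/completions",
    server_prompt="You are a concise assistant.",  # defaults to a single space if omitted
    temperature=0.7,   # documented default
    max_tokens=-1,     # -1 means no explicit generation limit (documented default)
    stream=False,      # documented default
)

if response is not None:
    # Assuming an OpenAI-style chat-completions payload, the reply text sits at
    # choices[0].message.content.
    print(response["choices"][0]["message"]["content"])
else:
    print("Request failed; chat_completion_api returned None.")
```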