Tech Tip: Resolving “model 'gpt-4o-mini' Not Found” with 4D AIKit Providers
PRODUCT: 4D | VERSION: 21 | PLATFORM: Mac & Win
Published On: February 2, 2026
When using 4D AIKit, developers often rely on default parameters to simplify early implementation. If no model is specified, AIKit automatically falls back to gpt-4o-mini. This behavior works seamlessly with OpenAI and usually requires no additional configuration.
This default becomes problematic when working with OpenAI-compatible providers such as Groq, Ollama, LM Studio, or Azure OpenAI. Although these providers follow the OpenAI API specification, they do not expose OpenAI model names. Each provider defines its own model identifiers or deployment names.
As a result, if no model is explicitly provided, AIKit still sends gpt-4o-mini in the request. Since this model does not exist on those platforms, the request fails with “model not found” or similar errors, even though the API key and base URL are correctly configured.
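Concretely, when no model is specified, the request body AIKit sends to the provider looks roughly like this (a sketch of the standard OpenAI-compatible chat payload; exact fields may vary):

```json
{
  "model": "gpt-4o-mini",
  "messages": [
    {"role": "user", "content": "What are the models of openai"}
  ]
}
```

Since the provider has no model registered under that name, it rejects the request with a “model not found” error, regardless of the credentials used.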
Example of a failing request:
var $client : cs.AIKit.OpenAI
var $messages : Collection
var $result : Object

$client:=cs.AIKit.OpenAI.new({\
	apiKey: "your-key"; \
	baseURL: "https://provider.here/v1"\
	})

$messages:=New collection
$messages.push({role: "user"; content: "What are the models of openai"})

// Fails: AIKit defaults to gpt-4o-mini, which does not exist on this provider
$result:=$client.chat.completions.create($messages)
The fix is to always specify a valid model when using providers other than OpenAI.
Example with a local Ollama provider:
$client:=cs.AIKit.OpenAI.new({\
	apiKey: "ollama"; \
	baseURL: "http://localhost:11434/v1"\
	})

// Passing the model explicitly makes the request valid on Ollama
$result:=$client.chat.completions.create($messages; {model: "llama3"})
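To make it harder to forget the model, one pattern is to keep the provider settings, including the model name, in a single configuration object and pass the model on every call. A minimal sketch (the Groq base URL and model name shown here are illustrative, not the only valid values):

```
// Centralize provider settings so the model is always explicit
var $config : Object
var $client : cs.AIKit.OpenAI
var $messages : Collection
var $result : Object

$config:={apiKey: "your-key"; baseURL: "https://api.groq.com/openai/v1"; model: "llama-3.1-8b-instant"}

$client:=cs.AIKit.OpenAI.new({apiKey: $config.apiKey; baseURL: $config.baseURL})

$messages:=New collection
$messages.push({role: "user"; content: "Hello"})

// The model from the config travels with every request
$result:=$client.chat.completions.create($messages; {model: $config.model})
```

Switching providers then only requires updating one object, and no call path can silently fall back to gpt-4o-mini.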