Thanks, Wappler Team, for listening to users.
Now we have the Custom Chat Provider, but it is not working yet: when trying to select a model, the list of models is empty.
I think this is because model selection has to be handled by the endpoint: BaseURL/api/v1/models
Hence the conclusion: you either need to add an input field for a Models Endpoint variable, or handle the BaseURL/models request yourselves.
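For illustration, a minimal sketch (my own code, assuming an OpenAI-compatible server; not Wappler's actual internals) of how the model list could be fetched from that endpoint:

```ts
// Sketch: discover available models from an OpenAI-compatible provider.
// Assumptions: `baseUrl` is whatever the user enters as Base URL (e.g. ending
// in /v1 or /api/v1), and the API key is optional for local servers.
async function listModels(baseUrl: string, apiKey?: string): Promise<string[]> {
  const headers: Record<string, string> = {};
  if (apiKey) headers['Authorization'] = `Bearer ${apiKey}`;

  // OpenAI-compatible servers answer GET {baseUrl}/models with
  // { object: 'list', data: [{ id, ... }, ...] }.
  const res = await fetch(`${baseUrl.replace(/\/+$/, '')}/models`, { headers });
  if (!res.ok) throw new Error(`GET /models failed with status ${res.status}`);

  const body = (await res.json()) as { data?: { id: string }[] };
  return (body.data ?? []).map((m) => m.id);
}
```

The returned IDs are exactly what the dropdown selector needs to show.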
Or, as an alternative: add an input field for the Models selector where model IDs can be entered manually (e.g. anthropic/claude-3.5-sonnet, gryphe/mythomax-l2-13b), which would then be used as models: ['anthropic/claude-3.5-sonnet', 'gryphe/mythomax-l2-13b'].
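To make the two suggestions concrete, a hypothetical config sketch (the CustomProviderConfig name and its fields are my invention, not Wappler's actual settings):

```ts
// Hypothetical provider config covering both suggestions above:
// option 1: a dedicated endpoint to fetch the model list from;
// option 2: a manually supplied list of model IDs.
interface CustomProviderConfig {
  baseUrl: string;          // e.g. 'https://openrouter.ai/api/v1'
  modelsEndpoint?: string;  // option 1: where to GET the model list
  models?: string[];        // option 2: user-typed model IDs
}

const example: CustomProviderConfig = {
  baseUrl: 'https://openrouter.ai/api/v1',
  models: ['anthropic/claude-3.5-sonnet', 'gryphe/mythomax-l2-13b'],
};
```

Either field alone would be enough to populate the dropdown.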
I’m trying to use the new AI integration in Wappler (version 7b29) and I’m having trouble connecting to both a local LM Studio instance and the Groq API. In both cases, no models are shown in the dropdown selector, even though the /models endpoint works perfectly when tested outside of Wappler.
I tested the /models endpoint using curl from bash: curl http://192.168.0.xx:xxxx/v1/models
It returned valid JSON with a full list of available models (e.g. meta-llama-3-8b-instruct, phi-3-mini-4k-instruct, etc.).
No API Key is required.
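For reference, the body follows the usual OpenAI-compatible shape; a sketch of what my LM Studio instance returned (only the id field matters for the dropdown; other fields vary by server):

```ts
// Sketch of the /v1/models response body; the model IDs below are ones my
// LM Studio instance actually listed, other fields are abbreviated.
const modelsResponse = {
  object: 'list',
  data: [
    { id: 'meta-llama-3-8b-instruct', object: 'model' },
    { id: 'phi-3-mini-4k-instruct', object: 'model' },
  ],
};

// Populating the dropdown would only need the IDs:
const modelIds = modelsResponse.data.map((m) => m.id);
```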
Despite this, Wappler does not populate the model list.