What
I am thinking of supporting models other than Claude, such as DeepSeek or open-source models run locally via Ollama.

Why
It would give users more flexibility over which specific LLM they want to use. An open-source model served through Ollama is also helpful because it can be set up and run locally very easily.

Example
No response
Replies: 4 comments 1 reply
The current model from Anthropic uses Computer Use.
I personally would love this. I used shortest and within 10 minutes used up $10 on Claude, so I stopped using the framework even though it provides amazing value. I would 100% start using it again if I could set up a local LLM.
Feel free to do it!
https://github.com/simular-ai/Agent-S could be used here, I guess. I'll try to test it and see if it works.
Once we migrate to the Vercel AI SDK, it should be easier to add support for new providers, provided those models support computer-use tools.
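To illustrate the idea, here is a minimal sketch of what a provider registry might look like behind such a migration. All names here (the registry shape, model identifiers, the `computerUse` flag) are assumptions for illustration, not the project's actual API; the point is that user-selectable models need a capability check, since not every local or open-source model supports computer-use tools.

```typescript
// Hypothetical provider registry: maps a user-facing model name to a
// provider and a capability flag. Model identifiers are illustrative.
type ProviderId = "anthropic" | "ollama" | "deepseek";

interface ModelChoice {
  provider: ProviderId;
  model: string;
  computerUse: boolean; // does this model support computer-use tools?
}

const MODELS: Record<string, ModelChoice> = {
  "claude-3-5-sonnet": {
    provider: "anthropic",
    model: "claude-3-5-sonnet-latest",
    computerUse: true,
  },
  "llama3": {
    provider: "ollama",
    model: "llama3",
    computerUse: false,
  },
};

function selectModel(name: string): ModelChoice {
  const choice = MODELS[name];
  if (!choice) {
    throw new Error(`Unknown model: ${name}`);
  }
  if (!choice.computerUse) {
    // Warn rather than fail: the model may still work for non-agentic tasks.
    console.warn(
      `${name} has no computer-use tool support; browser actions may fail.`
    );
  }
  return choice;
}
```

With a lookup like this, the rest of the framework can stay provider-agnostic: it asks for a model by name and hands the resolved choice to whichever SDK client corresponds to `provider`.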