Comment by pryelluw
Can it be adapted to use ollama? Seems like a good tool to setup locally as a navigation tool.
We accidentally didn't release the right types for LLMClient :/ However, if you set the version in package.json to "alpha", it will install what's on the main branch on GitHub, which should include the typing fix.
Yeah, I saw it was a recent change in your GitHub repo and was happily running your examples.
To be honest, it took about 2 minutes of playing around before I got annoyed with the inaccuracies of the locally hosted model, so I get why you encourage the other approaches.
Yes, you can certainly use Ollama! However, we strongly recommend using a beefier model to get reliable results. Check out the external_client.ts file in examples/ that shows you how to set up a custom LLMClient: <https://github.com/browserbase/stagehand/blob/main/examples/...>
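For anyone else trying this, here is a minimal sketch of the Ollama side of that setup. Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so a custom client can just POST chat completions to it; the model name (`llama3.1`) is an assumption, and wiring the call into Stagehand's LLMClient interface follows the external_client.ts example linked above rather than anything shown here.

```typescript
// Sketch: talking to a locally hosted Ollama model via its
// OpenAI-compatible chat completions endpoint. Assumes Ollama is
// running locally and the model ("llama3.1" here) has been pulled.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for a non-streaming chat completion request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send a single prompt to the local Ollama server and return the reply.
async function chatWithOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildChatRequest("llama3.1", [{ role: "user", content: prompt }]),
    ),
  });
  const data = await res.json();
  // OpenAI-compatible responses put the text under choices[0].message.content.
  return data.choices[0].message.content;
}
```

A Stagehand custom LLMClient would wrap a call like `chatWithOllama` (mapping Stagehand's request/response shapes onto it), which is exactly what the external_client.ts example demonstrates for a remote provider.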