pryelluw · 7 days ago
Any plans to support local models through llama.cpp or similar?
hugs · 6 days ago
100% yes. Favorites?

pryelluw · 6 days ago
I daily drive llama.cpp, so that, please.

hugs · 6 days ago
Which local models? (e.g. Qwen, Llama, Mistral?)

pryelluw · 5 days ago
Oh, Qwen mostly. The smaller ones, under 10B. Happy to be a tester! Email in profile.
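For context on what llama.cpp support would look like: the project's bundled `llama-server` exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a client mostly needs to point at a local URL. A minimal sketch, where the model name, port, and helper function are illustrative assumptions, not an actual integration:

```python
import json
import urllib.request

def build_chat_request(prompt, model="qwen2.5-7b-instruct",
                       base_url="http://localhost:8080"):
    """Build the URL and JSON body for a local llama.cpp chat request.

    The model name and port here are assumptions; llama-server listens on
    port 8080 by default and largely ignores the "model" field, serving
    whatever model it was launched with.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return f"{base_url}/v1/chat/completions", json.dumps(body).encode()

url, payload = build_chat_request("Hello from a local model!")
# Actually sending the request requires a running llama-server, e.g.:
#   req = urllib.request.Request(
#       url, data=payload, headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())
print(url)
```

Because the endpoint mirrors the OpenAI API shape, existing OpenAI-style clients can usually be reused by swapping the base URL.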