Comment by mirekrusin 9 hours ago
...it gets better:
GitHub Copilot is a service; you can buy a subscription here: https://github.com/features/copilot.
GitHub Copilot is available from the website https://github.com/copilot together with services like Spark (not available elsewhere), Spaces, Agents, etc.
GitHub Copilot is a VSCode extension which you can download at https://marketplace.visualstudio.com/items?itemName=GitHub.c... and use from VSCode.
The new version has native "Claude Code" integration for Anthropic models served via GitHub Copilot.
You can also use your own provider, e.g. a local llama.cpp-based one (if your GitHub Copilot subscription has it enabled / allows it at the enterprise level).
GitHub Copilot CLI is available for download here https://github.com/features/copilot/cli and it's a command-line interface.
Copilot for Pull Requests https://githubnext.com/projects/copilot-for-pull-requests
Copilot Next Edit Suggestion https://githubnext.com/projects/copilot-next-edit-suggestion...
Copilot Workspace https://githubnext.com/projects/copilot-workspace/
Copilot for Docs https://githubnext.com/projects/copilot-for-docs/
Copilot Completions CLI https://githubnext.com/projects/copilot-completions-cli/
Copilot Voice https://githubnext.com/projects/copilot-voice/
GitHub Copilot Radar https://githubnext.com/projects/copilot-radar/
Copilot View https://githubnext.com/projects/copilot-view/
Copilot Labs https://githubnext.com/projects/copilot-labs/
This list doesn't include project names without "Copilot" in them, like "Spark" or "Testpilot" (https://githubnext.com/projects/testpilot), etc.
Since we're talking about GitHub Copilot, I'll lodge my biggest complaint about it here! The context window is stuck at 128k for all models (except maybe Codex): https://github.com/microsoft/vscode/issues/264153 and https://github.com/anomalyco/opencode/issues/5993
This absolutely sucks, especially since tool calling can burn through tokens really fast. It feels like a not-so-gentle nudge toward using their 'official' tooling (read: VSCode), even though there was a recent announcement about GHCP working with opencode: https://github.blog/changelog/2026-01-16-github-copilot-now-...
No mention of it being severely gimped by the context limit in that press release, of course (tbf, why would they lol).
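For a sense of scale, here's a back-of-envelope sketch of how quickly tool-call output fills a 128k window. It uses the common ~4 characters/token heuristic, which is a rough approximation and not whatever tokenizer Copilot actually uses; the 40 KB output size and 8k-token overhead are made-up illustrative numbers:

```python
# Rough estimate: how many tool calls fit in a 128k-token context window.
# Assumes ~4 chars/token (a crude heuristic, not a real tokenizer).

CONTEXT_LIMIT = 128_000  # tokens, per the issue linked above
CHARS_PER_TOKEN = 4      # rough heuristic


def approx_tokens(text: str) -> int:
    """Very rough token estimate from character count."""
    return len(text) // CHARS_PER_TOKEN


def calls_until_full(tool_output_chars: int, overhead_tokens: int = 8_000) -> int:
    """Tool calls of a given output size that fit, after reserving
    some tokens for the system prompt and conversation history."""
    per_call = approx_tokens("x" * tool_output_chars)
    return (CONTEXT_LIMIT - overhead_tokens) // per_call


# A single grep or file read returning ~40 KB of text costs ~10k tokens,
# so roughly a dozen such calls exhaust the whole window.
print(calls_until_full(40_000))  # -> 12
```

Under those assumptions an agent doing file reads and searches can blow through the window in a dozen tool calls, which is why the cap hurts agentic workflows much more than chat.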
That said, if you go back to aider, 128K tokens is a lot, and the same goes for web chat... not a total deal-breaker, but I wouldn't spend my money on that particular service when there are better options!