> The Claude 3 models also offer a huge 200K token context window. Cody doesn’t use this entire context window today; we limit Cody’s context window to roughly 7K tokens.
In their Discord they mentioned they are supposedly working on increasing the context window, but provided no ETA. 7K is not enough.
I've been switching between WebStorm and VS Code to use Cody and Copilot together. I find that Copilot does better with autocomplete, while Cody's chat is better since I can choose between multiple LLMs like Mixtral 8x7B, which is working better for my React project.
I have been trying it with WebStorm. Pretty solid. The autocomplete is a little more accurate compared to Copilot's. Not a fan of the chat, or of the fact that the VS Code extension has more features, but whatever, it's free for now.