We've got some experimental work to support that. It's not merged yet, but you can follow https://github.com/sourcegraph/cody. I've been polling Twitter (https://twitter.com/sqs/status/1675433337354330113) to see how many people are actually using self-hosted LLMs for code completion already, and it seems like not many yet, but Code Llama with infill is a big advance that we're quite excited about.
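
For context on why infill matters for completion: Code Llama was trained with a fill-in-the-middle objective, so a client can give it the code both before and after the cursor rather than just a prefix. A rough sketch of building that kind of prompt (the `<PRE>`/`<SUF>`/`<MID>` sentinel layout follows the published Code Llama format; the function name and example snippet here are just illustrative):

```python
def build_infill_prompt(prefix: str, suffix: str) -> str:
    # Fill-in-the-middle prompt: the model sees the code before the
    # cursor (prefix) and after it (suffix), then generates the span
    # in between, stopping at an end-of-text sentinel.
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# Hypothetical editor state: cursor sits inside the function body.
prefix = "def add(a, b):\n    return "
suffix = "\n\nprint(add(1, 2))\n"
prompt = build_infill_prompt(prefix, suffix)
```

The model would then be expected to complete the middle (here, something like `a + b`), which is what makes infill so much better suited to in-editor completion than plain left-to-right generation.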