Support Alpaca and Llama models #8

Open
@samheutmaker

Description

Autodoc currently relies on OpenAI for access to cutting-edge language models. Going forward, we would like to support models running locally or at providers other than OpenAI, such as Llama or Alpaca. This gives developers more control over how their code is indexed, and allows indexing of private code that cannot be shared with OpenAI.

This is a big undertaking that will be an ongoing process. A few thoughts for anyone who wants to start hacking on this:

  1. It would be nice to be able to configure Autodoc with a LangChain LLM via the Autodoc config file. This would allow for complete control over how an LLM is configured.
  2. It seems like a lot of people are using llama.cpp to run LLaMA locally. It may be worth using it as a starting point for supporting other models.
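
A rough sketch of what point 1 could look like. This is purely illustrative: `LLMProvider`, `LocalLlamaProvider`, and `AutodocConfig` are hypothetical names, not the real Autodoc or LangChain APIs, and the provider just echoes instead of actually calling a llama.cpp server.

```typescript
// Hypothetical sketch only: none of these names are real Autodoc/LangChain APIs.

// Minimal stand-in for a LangChain-style LLM interface.
interface LLMProvider {
  call(prompt: string): Promise<string>;
}

// A local model provider, e.g. one that could talk to a llama.cpp server.
class LocalLlamaProvider implements LLMProvider {
  constructor(private endpoint: string) {}

  async call(prompt: string): Promise<string> {
    // A real implementation would POST the prompt to the llama.cpp endpoint;
    // the sketch just echoes so it stays self-contained.
    return `[${this.endpoint}] ${prompt}`;
  }
}

// The Autodoc config file could accept a factory function, giving users
// complete control over how the LLM is constructed.
interface AutodocConfig {
  name: string;
  llm: () => LLMProvider;
}

const config: AutodocConfig = {
  name: 'my-project',
  llm: () => new LocalLlamaProvider('http://localhost:8080/completion'),
};

async function main() {
  const llm = config.llm();
  console.log(await llm.call('Summarize src/index.ts'));
}

main();
```

The key design choice here is that the config exports a factory rather than a string model name, so swapping OpenAI for a local model is a one-line change in the user's config with no changes to Autodoc itself.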

This issue is high priority. If you're interested in working on it, please reach out.
