How to configure an answer generator with an LLM for question answering?

If you are not fully satisfied with the default LLM used to generate answers in your project, you can experiment with other off-the-shelf LLMs.

  • Go to the Processing view
  • Create a new answer generator
  • Give it a name
  • Click on “Create from default configuration”
  • Browse the list or, better, type “Text gen…” as a keyword to filter the result list
  • Select a cloud provider:
    • OpenAI for GPT models (requires an API key)
    • Microsoft Azure for GPT models (requires an API key)
    • DeepInfra for open-source models including Llama 2, Mistral 7B, Mixtral 8x7B, and Dolphin-Mixtral 8x7B (requires an API key)
  • Select the LLM (the API call sketch after this list illustrates the kind of model identifiers and keys involved)
  • Edit the default prompt if you want to adapt it (see the example prompt template after this list)
  • Don’t forget to save
  • Mark it with the yellow star if you want to use it by default in the question answering interface
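
To make the provider and model choices more concrete, here is a minimal Python sketch of the kind of call an answer generator typically makes once a provider, a model and an API key are configured. The model names, the DeepInfra base URL and the `generate_answer` helper are illustrative assumptions, not part of the product; check each provider's documentation for the exact values.

```python
# Illustrative sketch: shows the style of API call behind an answer generator.
# Model identifiers and the DeepInfra base URL are assumptions to verify
# against each provider's documentation.
from openai import OpenAI

# OpenAI: GPT models, authenticated with an OpenAI API key.
openai_client = OpenAI(api_key="<OPENAI_API_KEY>")

# DeepInfra exposes an OpenAI-compatible endpoint, so the same client type can
# be pointed at it to reach open-source models such as Mixtral 8x7B.
deepinfra_client = OpenAI(
    api_key="<DEEPINFRA_API_KEY>",
    base_url="https://api.deepinfra.com/v1/openai",
)

def generate_answer(client: OpenAI, model: str, prompt: str) -> str:
    """Send the assembled prompt to the selected LLM and return its answer."""
    response = client.chat.completions.create(
        model=model,  # e.g. "gpt-4o-mini" or "mistralai/Mixtral-8x7B-Instruct-v0.1"
        messages=[{"role": "user", "content": prompt}],
        temperature=0.2,  # a low temperature keeps answers close to the retrieved passages
    )
    return response.choices[0].message.content
```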
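
When editing the default prompt, you are adapting the template that combines the user's question with the retrieved passages. The sketch below only illustrates the general shape of such a template; the placeholder names (`{query}`, `{passages}`) and the `build_prompt` helper are hypothetical, and the actual default prompt depends on your project's configuration.

```python
# Hypothetical prompt template for a question-answering generator.
# Placeholder names and wording are examples, not the product's default prompt.
DEFAULT_PROMPT_TEMPLATE = """You are an assistant answering questions from a document collection.
Answer the question using only the passages below. If the passages do not
contain the answer, say that you don't know.

Passages:
{passages}

Question: {query}

Answer:"""

def build_prompt(query: str, passages: list[str]) -> str:
    """Fill the template with the user's question and the retrieved passages."""
    return DEFAULT_PROMPT_TEMPLATE.format(
        query=query,
        passages="\n\n".join(passages),
    )
```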