Server: add function calling API #5588

Closed
@ngxson

Description

Motivation

This subject was already brought up in #4216, but my initial research failed.

Recently, I discovered a new line of models designed specifically for this use case: https://github.com/MeetKai/functionary

This model can decide whether to call functions (and which function to call) in a given context. The chat template looks like this:

{#v2.2#}
{% for message in messages %}
  {% if message['role'] == 'user' or message['role'] == 'system' %}
    {{ '<|from|>' + message['role'] + '\n<|recipient|>all\n<|content|>' + message['content'] + '\n' }}
  {% elif message['role'] == 'tool' %}
    {{ '<|from|>' + message['name'] + '\n<|recipient|>all\n<|content|>' + message['content'] + '\n' }}
  {% else %}
    {% set contain_content='no'%}
    {% if message['content'] is not none %}
      {{ '<|from|>assistant\n<|recipient|>all\n<|content|>' + message['content'] }}
      {% set contain_content='yes'%}
    {% endif %}
    {% if 'tool_calls' in message and message['tool_calls'] is not none %}
      {% for tool_call in message['tool_calls'] %}
        {% set prompt='<|from|>assistant\n<|recipient|>' + tool_call['function']['name'] + '\n<|content|>' + tool_call['function']['arguments'] %}
        {% if loop.index == 1 and contain_content == 'no' %}
          {{ prompt }}
        {% else %}
          {{ '\n' + prompt}}
        {% endif %}
      {% endfor %}
    {% endif %}
    {{ '<|stop|>\n' }}
  {% endif %}
{% endfor %}
{% if add_generation_prompt %}
  {{ '<|from|>assistant\n<|recipient|>' }}
{% endif %}

Example:

<|from|>system
<|recipient|>all
<|content|>// Supported function definitions that should be called when necessary.
namespace functions {
// Get the current weather
type get_current_weather = (_: {
// The city and state, e.g. San Francisco, CA
location: string,
}) => any;
} // namespace functions
<|from|>system
<|recipient|>all
<|content|>A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. The assistant calls functions with appropriate input when necessary
<|from|>user
<|recipient|>all
<|content|>What is the weather for Istanbul?
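To make the template above concrete, here is a minimal Python sketch (not part of the issue, names are my own) that renders an OpenAI-style message list into the functionary v2.2 prompt format, following the Jinja template line by line:

```python
def format_functionary_prompt(messages, add_generation_prompt=True):
    """Render OpenAI-style messages into the functionary v2.2 prompt format."""
    out = []
    for msg in messages:
        role = msg["role"]
        if role in ("user", "system"):
            out.append(f"<|from|>{role}\n<|recipient|>all\n<|content|>{msg['content']}\n")
        elif role == "tool":
            # Tool results are attributed to the tool's own name
            out.append(f"<|from|>{msg['name']}\n<|recipient|>all\n<|content|>{msg['content']}\n")
        else:  # assistant
            parts = []
            if msg.get("content") is not None:
                parts.append(f"<|from|>assistant\n<|recipient|>all\n<|content|>{msg['content']}")
            for call in msg.get("tool_calls") or []:
                fn = call["function"]
                # Each tool call is addressed to the function's name
                parts.append(f"<|from|>assistant\n<|recipient|>{fn['name']}\n<|content|>{fn['arguments']}")
            out.append("\n".join(parts) + "<|stop|>\n")
    if add_generation_prompt:
        # Leave the recipient open so the model decides: "all" or a function name
        out.append("<|from|>assistant\n<|recipient|>")
    return "".join(out)
```

Note how the trailing `<|from|>assistant\n<|recipient|>` leaves the recipient for the model to fill in, which is exactly where it chooses between replying ("all") and calling a function.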

Possible implementation

Since this is currently the only publicly available model that can do this, it's quite risky to modify llama_chat_apply_template to support it (we may end up polluting the code base).

The idea is to first keep the implementation in the server example; then, when the template becomes more mainstream, we can adopt it in llama_chat_apply_template.

Data passing in the direction from user ==> model (input direction)

  • Add a function in the server example to parse the input request and format the prompt. Attention: with function calling, we will have 2 types of system messages: one for the actual prompt ("You are a helpful assistant") and one for the function definitions.
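As an illustration of the input direction, here is a hedged sketch (function and field names are hypothetical, not from the issue) of how the server could turn the request's tool definitions into the TypeScript-namespace system message shown in the example above, alongside the user-supplied system prompt:

```python
import json

def build_system_messages(request):
    """Emit two system messages: one for function definitions, one for the actual prompt."""
    tools = request.get("tools", [])
    lines = [
        "// Supported function definitions that should be called when necessary.",
        "namespace functions {",
    ]
    for tool in tools:
        fn = tool["function"]
        lines.append(f"// {fn.get('description', '')}")
        # A real implementation would translate the parameters' JSON schema into
        # the TypeScript-like syntax shown in the example; the raw schema is
        # inlined here only for brevity.
        lines.append(f"type {fn['name']} = (_: {json.dumps(fn['parameters'])}) => any;")
    lines.append("} // namespace functions")
    messages = [{"role": "system", "content": "\n".join(lines)}]
    for msg in request.get("messages", []):
        if msg["role"] == "system":
            # The actual prompt, e.g. "You are a helpful assistant"
            messages.append(msg)
    return messages
```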

Data passing in the direction from model ==> user (output direction)

  • Add a grammar to force the model to output JSON when it's inside a function-argument message
  • Add a parser to extract function arguments and return them as JSON
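The parser in the second bullet could look like this minimal sketch (my own, not from the issue), which splits the raw completion on the template's markers and separates plain text from tool calls:

```python
def parse_functionary_output(completion):
    """Parse a raw completion that starts after '<|from|>assistant\\n<|recipient|>'."""
    content = None
    tool_calls = []
    # Each segment looks like '<recipient>\n<|content|><payload>', where the
    # recipient is either 'all' (plain text) or a function name (a tool call).
    segments = completion.replace("<|stop|>", "").split("<|from|>assistant\n<|recipient|>")
    for seg in segments:
        if not seg:
            continue
        recipient, _, payload = seg.partition("\n<|content|>")
        payload = payload.strip()
        if recipient == "all":
            content = payload
        else:
            # The grammar-constrained payload is the JSON arguments string
            tool_calls.append({"function": {"name": recipient, "arguments": payload}})
    return {"content": content, "tool_calls": tool_calls}
```

The result maps directly onto the OpenAI-style `tool_calls` response shape, so the server can return it without further transformation.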

Metadata

Labels: demo (Demonstrate some concept or idea, not intended to be merged), enhancement (New feature or request), server/webui, stale