Provide a utility to count the tokens saved by the Semantic Cache #182

Open
@pierrelambert

Description

It would be nice to have an optional mode that makes it possible to monitor the number of tokens the Semantic Cache saves.
Today, as a workaround, the token count can be stored as a metadata item:

from langchain_community.callbacks import get_openai_callback

# Inside the request-handling function: run the agent, measure the tokens
# spent via the OpenAI callback, and keep the counts in Redis.
with get_openai_callback() as cb:
    response = agent.run(input=user_query, callbacks=[retrieval_handler])
    red.set("query_token_total", cb.total_tokens)  # tokens used for this query
    red.incr("session_token", cb.total_tokens)     # running session total
return response

(...)

# When writing to the Semantic Cache, attach the token count of the original
# LLM call as metadata so it can be credited later on a cache hit.
token_total = int(red.get("query_token_total") or 0)
if token_total != 0:
    llmcache.store(user_query, response, metadata={"token": token_total})
else:
    llmcache.store(user_query, response)
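
On the read side, the same metadata lets the application count the tokens a cache hit actually saved. Below is a minimal sketch, assuming that llmcache.check() returns hit dictionaries exposing the stored metadata (the exact return fields depend on the redisvl version), and using session_token_saved as a new Redis key introduced only for this example:

# Sketch of the read path. Assumptions: hits include the stored metadata as a
# dict, and "session_token_saved" is a key introduced here, not part of redisvl.
hits = llmcache.check(prompt=user_query)
if hits:
    cached = hits[0]
    response = cached["response"]
    saved = int(cached.get("metadata", {}).get("token", 0))
    red.incrby("session_token_saved", saved)  # running total of tokens the cache saved
else:
    # Cache miss: fall through to the agent call shown above,
    # then store the new answer with its token count.
    ...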

I am wondering if it would make sense to make this easier, as it would help highlight the value of the Semantic Cache.
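
As a rough illustration of what such a built-in utility could report, here is a small helper that only reads the counters kept by the snippets above (session_token and session_token_saved are key names assumed in these examples, not anything redisvl provides):

def token_savings_summary(red) -> dict:
    # Summarize the session counters maintained in the snippets above.
    spent = int(red.get("session_token") or 0)        # tokens paid to the LLM
    saved = int(red.get("session_token_saved") or 0)  # tokens avoided via cache hits
    total = spent + saved
    return {
        "tokens_spent": spent,
        "tokens_saved": saved,
        "savings_ratio": saved / total if total else 0.0,
    }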

Metadata

Labels

enhancement (New feature or request)
