inland-turquoise•2mo ago

How to integrate Composio with my specific use case?

I am currently hosting my custom model on a Hugging Face Space, and I am connecting it to my frontend using Gradio. This may be vague, but is there any way to integrate Composio here?
15 Replies
adverse-sapphire•2mo ago
Hey @Lovelin Dhoni, as I understand it, you have hosted your custom model on Hugging Face, you will treat it as your LLM, and you want to use Composio tools with it, am I right?
inland-turquoise•2mo ago
yes that is it
adverse-sapphire•2mo ago
Cool, that's awesome. Once you wrap your Hugging Face model as an LLM, you are good to connect it with any framework of your choice, like LangChain or CrewAI, and add your required Composio tools there. For reference you can see this example, where we used an open-source LLM from Hugging Face: https://docs.composio.dev/guides/python/news-summary
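To make "wrap your model as an LLM" concrete, here is a rough sketch of exposing a Gradio-hosted Space as a LangChain-compatible LLM that can then be passed to a CrewAI agent alongside Composio tools. The Space name and `api_name` below are placeholders, not details from this thread:

```python
# Sketch only: adapts a Hugging Face Space (served via Gradio) into a
# LangChain LLM so agent frameworks can call it like any other model.
from typing import Any, List, Optional

from gradio_client import Client          # pip install gradio_client
from langchain_core.language_models.llms import LLM


class SpaceLLM(LLM):
    """Minimal LangChain LLM backed by a Gradio Space endpoint."""

    space: str = "your-username/your-space"  # placeholder Space handle

    @property
    def _llm_type(self) -> str:
        return "hf-space"

    def _call(self, prompt: str, stop: Optional[List[str]] = None,
              **kwargs: Any) -> str:
        # The api_name depends on how your Space's Gradio app is defined.
        client = Client(self.space)
        return client.predict(prompt, api_name="/predict")


llm = SpaceLLM()
# tools = ComposioToolSet().get_tools(...)   # your Composio tools
# agent = Agent(role=..., llm=llm, tools=tools)  # hand both to the framework
```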
adverse-sapphire•2mo ago
Let me know if you need any other help 🙌
inland-turquoise•2mo ago
thanks @thatsmeadarsh, some of its features are overkill for my use case. Here is the file: https://github.com/lovelindhoni/flask-api/blob/main/index.py
inland-turquoise•2mo ago
can u just give me a high-level overview of how it can be done?
adverse-sapphire•2mo ago
It would be my pleasure to help you. Before that, could you tell me what exactly you want to achieve with your custom model and Composio tools? @Lovelin Dhoni
inland-turquoise•2mo ago
We are planning to integrate Composio's tools into our pipeline. This is more of a POC.
adverse-sapphire•2mo ago
Would you like to come over to #debug-help for a short conversation about this? Let me solve your issue there @Lovelin Dhoni
inland-turquoise•2mo ago
the flask file is minimal; it has a POST route that calls the hugging face model through gradio and returns the response
adverse-sapphire•2mo ago
So you want to create a route that takes some data, invokes the agent with your custom model and Composio tools, and returns a response indicating whether it succeeded (or whatever the agent gave as its response).
adverse-sapphire•2mo ago
https://docs.composio.dev/guides/python/calendar-agent You can do it just like here; just add it inside your route function
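As a minimal sketch of such a route: the `run_agent` helper below is a placeholder for kicking off your actual CrewAI agent (custom LLM plus Composio tools), e.g. `crew.kickoff(...)`; everything else is plain Flask:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def run_agent(task: str) -> str:
    # Placeholder: invoke your agent here (custom LLM + Composio tools)
    # and return its output. Swapped in so this sketch runs standalone.
    return f"agent handled: {task}"


@app.route("/agent", methods=["POST"])
def agent_route():
    # Take up some data, hand it to the agent, report success or failure.
    data = request.get_json(force=True)
    try:
        result = run_agent(data["task"])
    except Exception as exc:
        return jsonify({"ok": False, "error": str(exc)}), 500
    return jsonify({"ok": True, "result": result})
```

You can then POST `{"task": "..."}` to `/agent` from your frontend and read the agent's answer out of the JSON response.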
adverse-sapphire•2mo ago
Instead of using OpenAI, plug your own LLM in here:
calendar_agent = Agent(
    role="Google Calendar Agent",
    goal="""You take action on Google Calendar using Google Calendar APIs""",
    backstory="""You are an AI agent responsible for taking actions on Google Calendar on users' behalf.
    You need to take action on Calendar using Google Calendar APIs. Use correct tools to run APIs from the given tool-set.""",
    verbose=True,
    tools=tools,
    llm=llm,  # add your llm here
)
Hope these resources work for you. If you run into any trouble along the way, you can ping me here @Lovelin Dhoni
inland-turquoise•2mo ago
what does the pull function do? like pull("hwchase17/react-json") — from where exactly does it pull?
adverse-sapphire•2mo ago
LangChain Hub is a registry where users discover, share, and experiment with prompts for LangChain and Large Language Models (LLMs). `pull("hwchase17/react-json")` downloads the prompt published under that owner/name handle from the Hub; here we used it for the prompt template.
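A minimal usage sketch (requires the `langchain` package and network access to the Hub, so it won't run offline):

```python
from langchain import hub  # pip install langchain

# Fetches the ReAct JSON prompt published by user "hwchase17"
# on the LangChain Hub (https://smith.langchain.com/hub).
prompt = hub.pull("hwchase17/react-json")

# The pulled object is a prompt template; you can inspect which
# variables it expects before wiring it into an agent.
print(prompt.input_variables)
```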