irshadnilam | 5 hours ago
While I am not familiar with OP's project, I can somewhat answer this to the best of my knowledge. Right now, businesses communicate with REST APIs. That is why we have API gateways like AWS API Gateway, Apigee, WSO2 (the company I used to work at), Kong, etc., so businesses can securely deploy and expose APIs. As LLMs get better, the idea is we will eventually move to a world where AI agents do most business tasks, and businesses will want to expose AI agents instead of APIs. This is where protocols like A2A come in. Google, partnering with some other giants, introduced the A2A protocol a while ago; it is now under the Linux Foundation. It is a standard for one agent to talk to another agent regardless of the framework (LangChain, CrewAI, etc.) used to build the agent.
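The discovery side of A2A can be sketched roughly like this: an agent publishes a JSON "agent card" (conventionally served at `/.well-known/agent.json`) so other agents can find and call it regardless of framework. The field names and values below are loose assumptions for illustration, not the authoritative schema — check the current A2A spec before relying on them.

```python
import json

# Hypothetical A2A-style agent card. All names, URLs, and skills here
# are made up; the exact schema is defined by the A2A specification.
agent_card = {
    "name": "invoice-agent",
    "description": "Extracts line items from invoices",
    "url": "https://agents.example.com/invoice",   # hypothetical endpoint
    "capabilities": {"streaming": False},
    "skills": [
        {"id": "extract", "description": "Extract invoice line items"},
    ],
}

# Serialize the card as it would be served for discovery.
card_json = json.dumps(agent_card, indent=2)
print(card_json)
```

The point of the card is the framework-agnostic part of the comment: a CrewAI agent can read this document and call a LangChain agent without knowing anything about how it was built.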
magackame | 5 hours ago | parent
Can't you just put the agent behind a REST API and give the other agents a curl tool + doc?
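The alternative being proposed can be sketched in a few lines: wrap the agent in a plain HTTP endpoint and let any caller (human or another agent's curl tool) POST tasks to it. The handler and the stand-in `run_agent` function below are hypothetical, using only the Python standard library.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_agent(task: str) -> str:
    # Stand-in for a real agent call (an LLM, a LangChain chain, etc.).
    return f"done: {task}"

class AgentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body, run the "agent", return a JSON result.
        length = int(self.headers.get("Content-Length", 0))
        task = json.loads(self.rfile.read(length))["task"]
        body = json.dumps({"result": run_agent(task)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), AgentHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "curl tool + doc" side, done with urllib so the demo is self-contained:
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_address[1]}",
    data=json.dumps({"task": "parse invoice"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())["result"]
server.shutdown()
print(result)  # -> done: parse invoice
```

This does work for a single pairing; what a protocol like A2A adds on top is the standardized part — discovery, task lifecycle, and message format — so the "doc" doesn't have to be re-read and re-implemented per agent.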