Agents
Autonomous actors powered by large language models.
Ockam Agents are intelligent, autonomous actors that accomplish complex, open-ended tasks. Python apps running in Ockam AI can create millions of parallel, collaborating agents in seconds.
Each agent:
- Has a unique identity.
- Uses a large language model to understand natural language.
- Has memory of conversations.
- Makes plans that break down complex tasks into small iterative steps.
- Can retrieve knowledge beyond the training data of its language model.
- Invokes tools to gather new information and take actions.
- Collaborates with and can delegate work to other agents.
Build
You can find all the code for the following example on GitHub.
Let’s build a new app that provides a Legal AI Agent. Within the app directory, create the following three files:

- The `ockam.yaml` configuration file defines how to deploy a Zone in your Cluster in Ockam AI.
- The `images` directory contains the source code of docker images that will be used to run containers in your Zone. Inside `images`, there is a directory for the `main` image:
  - `Dockerfile` describes how the main image will be compiled.
  - `main.py` is the Python program that is run by the main image.
The `ockam.yaml` configuration file defines how to deploy your Zone:

- Create a Zone named `01`.
- Create a Pod named `main-pod` inside the `01` Zone.
- Make the HTTP server, in the `main` container in this pod, public.
- Run a `main` container using the `main` image.
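As a rough illustration, a configuration expressing those four steps might be shaped like the sketch below. The field names here are assumptions made for illustration, not the confirmed schema; refer to the example code on GitHub for the real file.

```yaml
# Sketch only: field names are illustrative assumptions, not the real schema.
zone:
  name: "01"                # the Zone
  pods:
    - name: main-pod        # Pod inside Zone 01
      containers:
        - name: main
          image: main       # built from images/main
          http:
            public: true    # make the container's HTTP server public
```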
The `Dockerfile` bases the `main` image on the `ockam-python` image, which already contains the `ockam` python package. The `Dockerfile` then copies the contents of the `images/main` directory into the image and sets `main.py` as the program to run when the container is started.
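Given that description, the Dockerfile is likely only a few lines. This sketch follows the description above; the exact contents live in the example repository on GitHub.

```dockerfile
# Sketch based on the description above; see the example repo for the real file.
FROM ockam-python          # base image that already contains the ockam package
COPY . .                   # contents of the images/main directory
CMD ["python", "main.py"]  # program to run when the container starts
```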
The `main.py` file:

- Turns your Python app into an Ockam Node. An Ockam Node in your Cluster can connect with and deliver messages to any other Ockam Node in your Cluster that is running in Ockam AI.
- After the Node is initialized, it invokes the `main` function defined in your `main.py` file. The `main` function starts an agent named `henry` with specific instructions. It then sends a message to this agent and prints the agent’s reply. `Node.start` also starts an HTTP server within this Python app.
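Putting those pieces together, `main.py` might look roughly like the sketch below. The `Agent.start` call, its parameters, and the `send` method are assumed API shapes for illustration; only `Node.start` and the behavior described above come from this guide, so consult the example repository for the real file.

```python
# Sketch of main.py; Agent.start and henry.send are assumed API shapes.
from ockam import Agent, Node  # import path is an assumption

async def main(node):
    # Start an agent named "henry" with specific instructions.
    henry = await Agent.start(
        node,
        name="henry",
        instructions="You are Henry, a legal expert who reviews contracts.",
    )
    # Send a message to the agent and print its reply.
    reply = await henry.send("What should I look for in an NDA?")
    print(reply)

# Node.start turns this Python app into an Ockam Node, starts its HTTP
# server, and invokes main once the Node is initialized.
Node.start(main)
```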
Run
Check that Docker is installed and running on your workstation before running the `ockam` command.

Run the `ockam` command in the directory that contains `ockam.yaml`. It will:
- Build each image in `images/` and push them to a container registry that is available to your Zone.
- Deploy the Zone into your Cluster, in Ockam AI, based on the configuration that you specified above in `ockam.yaml`.
- Open two portal inlets on your workstation that connect to corresponding outlets in the `main-pod` that is running in your Zone in Ockam AI.
Logs
You can see the logs for all containers in your deployed zone at `http://localhost:3000`.
HTTP
By default, the pod exposes an HTTP interface with a few endpoints to interact with the deployed agents.
The public URL for the HTTP server on your pod (each pod will have its own unique URL) is displayed in the example output above:
The http server on the main-pod is available at: https://25df35de87aa441b88f22a6c2a830a17-01.ai.ockam.network
To return a list of all agents, send a `GET` request to the `/agents` endpoint:
To interact with a specific agent, send a `POST` request to the `/agents/<name>` endpoint:
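The two endpoints can be exercised from plain Python. The sketch below runs against a local stub server so it is self-contained and runnable offline; the JSON request and response shapes (`message`, `reply`) are assumptions, and against a real deployment you would use your pod's public URL instead of the stub.

```python
import http.server
import json
import threading
import urllib.request

# Local stub standing in for the pod's HTTP server; the JSON shapes used
# here are assumptions, not the confirmed Ockam AI wire format.
class Stub(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"

    def _json(self, payload):
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        if self.path == "/agents":
            self._json(["henry"])  # list of agent names

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        message = json.loads(self.rfile.read(length))
        if self.path == "/agents/henry":
            self._json({"reply": f"You said: {message['message']}"})

    def log_message(self, *args):
        pass  # keep output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Stub)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

# GET /agents returns the list of all agents.
with urllib.request.urlopen(f"{base}/agents") as resp:
    agents = json.load(resp)
print(agents)  # ['henry']

# POST /agents/<name> sends a message to a specific agent.
req = urllib.request.Request(
    f"{base}/agents/henry",
    data=json.dumps({"message": "hello"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)["reply"]
print(reply)  # You said: hello
server.shutdown()
```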
For some use cases, especially user-facing ones with potentially long responses, it may be preferable to stream the response. Passing a `stream=true` parameter to the `/agents/<name>` endpoint will stream chunked responses from the agent in real time:
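A minimal sketch of consuming such a streamed response, again against a local stub that emits a chunked reply so the example is self-contained. Passing `stream=true` as a query parameter is an assumption made here for illustration; against a real deployment, point the request at `/agents/<name>` on your pod's public URL.

```python
import http.server
import threading
import urllib.request

# Local stub that streams a chunked response, standing in for the pod's
# /agents/<name> endpoint when stream=true is passed.
class StreamStub(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # chunked transfer requires HTTP/1.1

    def do_POST(self):
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.send_header("Transfer-Encoding", "chunked")
        self.end_headers()
        for chunk in (b"Drafting", b" a", b" reply..."):
            # Chunk framing: hex length, CRLF, data, CRLF.
            self.wfile.write(f"{len(chunk):x}\r\n".encode() + chunk + b"\r\n")
            self.wfile.flush()
        self.wfile.write(b"0\r\n\r\n")  # terminating chunk

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), StreamStub)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/agents/henry?stream=true"

# Consume the response incrementally instead of waiting for the full reply.
req = urllib.request.Request(url, data=b"{}")
pieces = []
with urllib.request.urlopen(req) as resp:
    while piece := resp.read(8):  # arrives as the agent produces it
        pieces.append(piece.decode())
print("".join(pieces))  # Drafting a reply...
server.shutdown()
```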