Deploying Your Tools to an Endpoint
The final step before you can use your tools in an AI application is to deploy them to an endpoint. To deploy your tools, you need to have the watsonx.ai Flows Engine Node.js CLI installed and to sign up for a free account.
Deploying
You can deploy the tools to an endpoint running in the cloud using the following command:
```bash
wxflows deploy
```
Once the deployment succeeds, the endpoint where the tools are deployed will be printed in your terminal. It will look something like this:
```
https://USERNAME.REGION.ibm.stepzen.net/api/my-project/graphql
```
To send requests to the endpoint directly, you can use a GraphQL IDE such as API Connect for GraphQL or Altair.
The endpoint is protected by your API key, which you can retrieve from the dashboard or by running the following command in your terminal:
```bash
wxflows whoami --apikey
```
Add the API key to the Authorization header of every request you send to the endpoint:
```json
{
  "Authorization": "apikey YOUR_WXFLOWS_APIKEY"
}
```
Use the Endpoint in Your Application
To use this endpoint in an application, use the watsonx.ai Flows Engine SDK for JavaScript or Python. The SDK works with popular libraries for building AI applications, such as LangChain and LangGraph, as well as with SDKs provided by LLM providers such as watsonx.ai and OpenAI.
See the GitHub repository for examples, including an end-to-end chat application you can use with your endpoint.