Before you start, ensure the base_url and api_key refer to FriendliAI.
Since our products are fully compatible with the OpenAI SDK, you can follow the examples below as-is. Choose one of the available models for the model parameter.
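For instance, a minimal client setup might look like the sketch below. The serverless base URL and the FRIENDLI_TOKEN environment variable are assumptions for illustration; use the values from your FriendliAI account.

import os
from openai import OpenAI

# Point the OpenAI SDK at FriendliAI by overriding base_url and api_key.
# Assumed serverless endpoint URL and token variable; adjust to your setup.
client = OpenAI(
    base_url="https://api.friendli.ai/serverless/v1",
    api_key=os.environ.get("FRIENDLI_TOKEN"),
)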
The chat completion API generates a response from a given conversation. We provide multiple usage examples; pick the one that best fits your needs.
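As a minimal sketch under the same assumptions (the assumed serverless base URL and the meta-llama-3.1-8b-instruct model, which also appears in the tool-assisted example below), a basic chat completion call could look like this:

import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.friendli.ai/serverless/v1",  # assumed serverless URL
    api_key=os.environ.get("FRIENDLI_TOKEN"),
)

# Generate a single response from a short conversation.
completion = client.chat.completions.create(
    model="meta-llama-3.1-8b-instruct",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me how to make a delicious pancake."},
    ],
)
print(completion.choices[0].message.content)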
This feature is in Beta and available only on the Serverless Endpoints.
With the tool-assisted chat completion API, models can use built-in tools prepared for tool calls, enhancing their ability to provide more comprehensive and actionable responses. Available tools are listed here.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.friendli.ai/serverless/tools/v1",
    api_key=os.environ.get("FRIENDLI_TOKEN"),
)

stream = client.chat.completions.create(
    model="meta-llama-3.1-8b-instruct",
    messages=[
        {
            "role": "user",
            "content": "What is the current average home price in New York City, and if I put 15% down, how much will my mortgage be?",
        }
    ],
    # Built-in tools the model may call while answering.
    tools=[
        {"type": "web:search"},
        {"type": "math:calculator"},
    ],
    stream=True,
)

for chunk in stream:
    if chunk.choices is None:
        # Chunks without choices carry tool events; print their event/data fields.
        print(f"{chunk.event=}, {chunk.data=}")
    elif chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")