together

pixeltable.functions.together

Pixeltable UDFs that wrap various endpoints from the Together AI API. To use them, you must first run pip install together and configure your Together AI credentials, as described in the Working with Together AI tutorial.
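
For example, one way to provide credentials is through the TOGETHER_API_KEY environment variable, which the together client reads by default (a minimal sketch; see the tutorial for Pixeltable-specific configuration options):

>>> import os
... os.environ['TOGETHER_API_KEY'] = 'your-api-key'  # placeholder key
... import pixeltable as pxt
... from pixeltable.functions import together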

chat_completions

chat_completions(
    messages: JsonT,
    *,
    model: str,
    max_tokens: Optional[int] = None,
    stop: Optional[JsonT] = None,
    temperature: Optional[float] = None,
    top_p: Optional[float] = None,
    top_k: Optional[int] = None,
    repetition_penalty: Optional[float] = None,
    logprobs: Optional[int] = None,
    echo: Optional[bool] = None,
    n: Optional[int] = None,
    safety_model: Optional[str] = None,
    response_format: Optional[JsonT] = None,
    tools: Optional[JsonT] = None,
    tool_choice: Optional[JsonT] = None
) -> JsonT

Generate chat completions for a conversation (a list of messages) using a specified model.

Equivalent to the Together AI chat/completions API endpoint. For additional details, see: https://docs.together.ai/reference/chat-completions-1

Requirements:

  • pip install together

Parameters:

  • messages (JsonT) –

    A list of messages comprising the conversation so far.

  • model (str) –

    The name of the model to query.

For details on the other parameters, see: https://docs.together.ai/reference/chat-completions-1

Returns:

  • JsonT

    A dictionary containing the response and other metadata.

Examples:

Add a computed column that applies the model mistralai/Mixtral-8x7B-v0.1 to an existing Pixeltable column tbl.prompt of the table tbl:

>>> messages = [{'role': 'user', 'content': tbl.prompt}]
... tbl['response'] = chat_completions(messages, model='mistralai/Mixtral-8x7B-v0.1')
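
Optional parameters such as max_tokens and temperature are passed through to the endpoint, and the JSON result can be unpacked with Pixeltable's JSON path expressions. As a sketch (assuming the response follows Together AI's standard choices[0].message.content layout, and with illustrative parameter values):

>>> tbl['response'] = chat_completions(messages, model='mistralai/Mixtral-8x7B-v0.1', max_tokens=300, temperature=0.7)
... tbl['answer'] = tbl.response.choices[0].message.content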

completions

completions(
    prompt: str,
    *,
    model: str,
    max_tokens: Optional[int] = None,
    stop: Optional[JsonT] = None,
    temperature: Optional[float] = None,
    top_p: Optional[float] = None,
    top_k: Optional[int] = None,
    repetition_penalty: Optional[float] = None,
    logprobs: Optional[int] = None,
    echo: Optional[bool] = None,
    n: Optional[int] = None,
    safety_model: Optional[str] = None
) -> JsonT

Generate completions based on a given prompt using a specified model.

Equivalent to the Together AI completions API endpoint. For additional details, see: https://docs.together.ai/reference/completions-1

Requirements:

  • pip install together

Parameters:

  • prompt (str) –

    A string providing context for the model to complete.

  • model (str) –

    The name of the model to query.

For details on the other parameters, see: https://docs.together.ai/reference/completions-1

Returns:

  • JsonT

    A dictionary containing the response and other metadata.

Examples:

Add a computed column that applies the model mistralai/Mixtral-8x7B-v0.1 to an existing Pixeltable column tbl.prompt of the table tbl:

>>> tbl['response'] = completions(tbl.prompt, model='mistralai/Mixtral-8x7B-v0.1')
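
The remaining keyword arguments are forwarded to the endpoint; for example, to cap the output length and adjust sampling (the values below are illustrative):

>>> tbl['response'] = completions(tbl.prompt, model='mistralai/Mixtral-8x7B-v0.1', max_tokens=100, temperature=0.7)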

embeddings

embeddings(input: str, *, model: str) -> ArrayT

Query an embedding model for a given string of text.

Equivalent to the Together AI embeddings API endpoint. For additional details, see: https://docs.together.ai/reference/embeddings-2

Requirements:

  • pip install together

Parameters:

  • input (str) –

    A string providing the text for the model to embed.

  • model (str) –

    The name of the embedding model to use.

Returns:

  • ArrayT

An array containing the embedding of input.

Examples:

Add a computed column that applies the model togethercomputer/m2-bert-80M-8k-retrieval to an existing Pixeltable column tbl.text of the table tbl:

>>> tbl['response'] = embeddings(tbl.text, model='togethercomputer/m2-bert-80M-8k-retrieval')
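
The resulting array can also back a similarity index. The following is a sketch only, assuming Pixeltable's add_embedding_index and the .using() parameter-binding mechanism described in the Pixeltable indexing documentation:

>>> tbl.add_embedding_index('text', string_embed=embeddings.using(model='togethercomputer/m2-bert-80M-8k-retrieval'))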

image_generations

image_generations(
    prompt: str,
    *,
    model: str,
    steps: Optional[int] = None,
    seed: Optional[int] = None,
    height: Optional[int] = None,
    width: Optional[int] = None,
    negative_prompt: Optional[str] = None
) -> ImageT

Generate images based on a given prompt using a specified model.

Equivalent to the Together AI images/generations API endpoint. For additional details, see: https://docs.together.ai/reference/post_images-generations

Requirements:

  • pip install together

Parameters:

  • prompt (str) –

    A description of the desired images.

  • model (str) –

    The model to use for image generation.

For details on the other parameters, see: https://docs.together.ai/reference/post_images-generations

Returns:

  • ImageT

    The generated image.

Examples:

Add a computed column that applies the model runwayml/stable-diffusion-v1-5 to an existing Pixeltable column tbl.prompt of the table tbl:

>>> tbl['response'] = image_generations(tbl.prompt, model='runwayml/stable-diffusion-v1-5')
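
The optional generation parameters are forwarded to the endpoint; for example, to control resolution and the number of diffusion steps (values are illustrative):

>>> tbl['response'] = image_generations(
...     tbl.prompt, model='runwayml/stable-diffusion-v1-5',
...     steps=30, height=512, width=512, negative_prompt='blurry, low quality')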