How can I use a local model #811

@GXKIM

Description

I am serving a model with vLLM, and I want to use the web-based LangSmith together with this local model. The code is below. How should I configure the model so that its runs show up on the LangSmith web page?

### Code

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model_name="Qwen2.5-14B-Instruct",
    base_url="http://xxx:9009/v1",
    api_key="EMPTY",
    temperature=0,
).bind(response_format={"type": "json_object"})
```
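LangSmith tracing in LangChain is configured through environment variables rather than through the model client itself, so a `ChatOpenAI` instance pointed at a vLLM server's OpenAI-compatible `base_url` should be traced like any other model once tracing is enabled. A minimal sketch, assuming the standard LangSmith variables (the key and project name below are placeholders):

```python
import os

# Enable LangSmith tracing. These variables are read by LangChain at
# run time, independently of which backend the chat model points at.
os.environ["LANGCHAIN_TRACING_V2"] = "true"          # turn tracing on
os.environ["LANGCHAIN_API_KEY"] = "<langsmith-key>"  # placeholder key
os.environ["LANGCHAIN_PROJECT"] = "local-vllm"       # placeholder project

# With these set, any llm.invoke(...) call made by the ChatOpenAI client
# above should be recorded as a run in the chosen LangSmith project.
```

Set these before the model is invoked; the runs then appear under the named project in the LangSmith web UI.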
