vllm.entrypoints.serve.tokenize.protocol

TokenizerInfoResponse

Bases: OpenAIBaseModel

Response containing the tokenizer configuration, equivalent to tokenizer_config.json.

Source code in vllm/entrypoints/serve/tokenize/protocol.py
class TokenizerInfoResponse(OpenAIBaseModel):
    """
    Response containing the tokenizer configuration,
    equivalent to tokenizer_config.json.
    """

    # Allow extra fields so that every key present in the model's
    # tokenizer_config.json is carried through to the response.
    model_config = ConfigDict(extra="allow")
    tokenizer_class: str
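
Because the model sets extra="allow", pydantic accepts and preserves fields beyond the declared tokenizer_class, so the full contents of tokenizer_config.json survive validation and serialization. A minimal sketch of that behavior (the field values below are illustrative, not taken from the source):

from vllm.entrypoints.serve.tokenize.protocol import TokenizerInfoResponse

# Only tokenizer_class is declared on the model; the other keys are
# accepted because of extra="allow" and kept in the serialized output.
info = TokenizerInfoResponse(
    tokenizer_class="LlamaTokenizerFast",  # illustrative value
    model_max_length=4096,                 # extra field, retained
    padding_side="left",                   # extra field, retained
)

assert info.tokenizer_class == "LlamaTokenizerFast"
print(info.model_dump())  # includes the extra fields as well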