
Why can DeepSeek-R1-Distill-Qwen-7B no longer be deployed and called with FastAPI after fine-tuning with LLaMA-Factory? #399

Open
@lhhssd

Description


Hi, why can DeepSeek-R1-Distill-Qwen-7B no longer be deployed and called with FastAPI after fine-tuning with LLaMA-Factory?


Error message:

root@autodl-container-01eb4295e8-d352b5ee:# python 1.py
Traceback (most recent call last):
  File "/root/1.py", line 79, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=False)
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 934, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2036, in from_pretrained
    return cls._from_pretrained(
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2276, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py", line 169, in __init__
    self.sp_model = self.get_spm_processor(kwargs.pop("from_slow", False))
  File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py", line 196, in get_spm_processor
    tokenizer.Load(self.vocab_file)
  File "/root/miniconda3/lib/python3.10/site-packages/sentencepiece/__init__.py", line 961, in Load
    return self.LoadFromFile(model_file)
  File "/root/miniconda3/lib/python3.10/site-packages/sentencepiece/__init__.py", line 316, in LoadFromFile
    return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
TypeError: not a string
root@autodl-container-01eb4295e8-d352b5ee:#
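A likely cause, based on the traceback: `use_fast=False` makes transformers build the slow, SentencePiece-based `LlamaTokenizer`, which needs a `tokenizer.model` vocab file. The DeepSeek-R1-Distill-Qwen checkpoints (and the directory LLaMA-Factory exports after merging) ship only the fast tokenizer files (`tokenizer.json`), so `vocab_file` ends up `None` and SentencePiece raises `TypeError: not a string`. Below is a minimal sketch of the loading step that avoids the slow path, assuming the merged model was exported to a local directory (the path is hypothetical):

```python
# Minimal sketch, not the original 1.py: load the fine-tuned model with the
# fast tokenizer instead of forcing the slow SentencePiece tokenizer.
from transformers import AutoTokenizer, AutoModelForCausalLM

# Hypothetical path -- replace with the directory LLaMA-Factory exported to.
model_name_or_path = "/root/autodl-tmp/DeepSeek-R1-Distill-Qwen-7B-merged"

# use_fast=True (the default) reads tokenizer.json and does not need tokenizer.model.
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # requires accelerate; places weights on the available GPU
)
```

If the rest of the FastAPI script is unchanged, dropping `use_fast=False` from the `AutoTokenizer.from_pretrained` call on line 79 may be enough to get past this error.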
