maxkb/apps/setting/models_provider/impl/vllm_model_provider/model
CaptainB 3c6b65baa1 fix: Remove vllm image cache
--bug=1052365 --user=刘瑞斌 [github#2353] Changing max tokens for the vllm vision model does not take effect https://www.tapd.cn/57709429/s/1657667
2025-02-24 16:30:13 +08:00
embedding.py feat: Support vllm embedding model 2025-01-20 16:24:56 +08:00
image.py fix: Remove vllm image cache 2025-02-24 16:30:13 +08:00
llm.py fix: tti model (#2060) 2025-01-21 17:50:01 +08:00