Actions: EmbeddedLLM/vllm

Showing runs from all workflows: 403 workflow runs

Close inactive issues and PRs #234: Scheduled
June 23, 2025 02:16 · 8s · main

Close inactive issues and PRs #233: Scheduled
June 22, 2025 02:16 · 8s · main

Fix: Check the type of params to be a Sequence not list. (#19910)
pre-commit #132: Commit 8ca81bb pushed by tjtanaa
June 21, 2025 02:43 · 6m 7s · main

Close inactive issues and PRs #232: Scheduled
June 21, 2025 02:14 · 11s · main

[Bugfix][Ray] Set the cuda context eagerly in the ray worker (#19583)
pre-commit #131: Commit 5e666f7 pushed by vllmellm
June 20, 2025 07:05 · 6m 26s · main

Close inactive issues and PRs #231: Scheduled
June 20, 2025 02:15 · 10s · main

[Doc] Update V1 user guide for embedding models (#19842)
pre-commit #130: Commit 6f68c49 pushed by tjtanaa
June 19, 2025 15:41 · 6m 20s · main

[Frontend] Add optional token-level progress bar to LLM.beam_search
pre-commit #129: Commit 466166d pushed by tjtanaa
June 19, 2025 08:03 · 7m 8s · main

Close inactive issues and PRs #230: Scheduled
June 19, 2025 02:15 · 10s · main

Close inactive issues and PRs #229: Scheduled
June 18, 2025 02:15 · 8s · main

Close inactive issues and PRs #228: Scheduled
June 17, 2025 02:15 · 9s · main

Close inactive issues and PRs #227: Scheduled
June 16, 2025 02:16 · 11s · main

Close inactive issues and PRs #226: Scheduled
June 15, 2025 02:16 · 8s · main

Close inactive issues and PRs #225: Scheduled
June 14, 2025 02:12 · 9s · main

Close inactive issues and PRs #224: Scheduled
June 13, 2025 03:07 · 8s · main

Close inactive issues and PRs #223: Scheduled
June 12, 2025 03:06 · 8s · main

Close inactive issues and PRs #222: Scheduled
June 11, 2025 03:07 · 8s · main

Use xla flag to improve the quantized model performance (#19303)
pre-commit #128: Commit 9af6d22 pushed by vllmellm
June 10, 2025 03:40 · 6m 11s · main

Close inactive issues and PRs #221: Scheduled
June 10, 2025 03:08 · 9s · main

[Frontend] Remove unreachable code from llm.py (#19288)
pre-commit #127: Commit 8335667 pushed by vllmellm
June 9, 2025 04:19 · 6m 11s · main

Close inactive issues and PRs #220: Scheduled
June 9, 2025 03:11 · 10s · main

Close inactive issues and PRs #219: Scheduled
June 8, 2025 03:12 · 13s · main

Close inactive issues and PRs #218: Scheduled
June 7, 2025 02:13 · 10s · main

[Bugfix] Fix EAGLE vocab embedding construction for Llama 70B (#19033)
pre-commit #126: Commit 3465b87 pushed by vllmellm
June 6, 2025 02:20 · 6m 56s · main

Close inactive issues and PRs #217: Scheduled
June 6, 2025 02:14 · 12s · main