Commit 0dcff08
fix(pt): recognize AOTInductor-wrapped CUDA OOM in AutoBatchSize
When running `dp --pt-expt test` (or any path that goes through
`deepmd.pt_expt.infer.deep_eval`) against a `.pt2` AOTInductor
package, `AutoBatchSize` doubles the batch on every success. For
models with a large `sel` the exploration eventually saturates GPU
memory, and the CUDA caching allocator raises the usual
`CUDA out of memory` from inside the AOTInductor runtime.
AOTInductor then rewraps that error as a generic
RuntimeError: run_func_(...) API call failed at
.../aoti_runner/model_container_runner.cpp, line 144
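A minimal reproduction of the wrapping pattern (illustrative, not deepmd code). The chaining via `raise ... from` is hypothetical here: with today's PyTorch the original OOM text from the C++ runtime is only printed to stderr, not attached as a Python `__cause__`, which is exactly why the wrapper signature itself must be matched. The sketch shows why a check on `e.args[0]` alone cannot see the OOM text even in the best case:

```python
def run_aoti():
    """Stand-in for an AOTInductor call that hits a CUDA OOM and
    rewraps it in a generic RuntimeError."""
    try:
        raise RuntimeError("CUDA out of memory. Tried to allocate 2.00 GiB")
    except RuntimeError as oom:
        # The wrapper message carries no OOM text of its own.
        raise RuntimeError(
            "run_func_(...) API call failed at model_container_runner.cpp, line 144"
        ) from oom


try:
    run_aoti()
except RuntimeError as e:
    # The old substring check on e.args[0] misses the OOM entirely...
    assert "CUDA out of memory" not in e.args[0]
    # ...while the chained cause (when present) still carries it.
    assert "CUDA out of memory" in str(e.__cause__)
```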
The original "CUDA out of memory" text is printed only to stderr,
so the old `is_oom_error` -- which keyed on a short list of
substrings in `e.args[0]` -- never matched. `execute()` therefore
did not shrink the batch; the exception propagated and the run
crashed even though no other process was competing for the GPU (as
confirmed by monitoring `nvidia-smi --query-compute-apps`, which showed
dp itself as the sole consumer, holding tens of GiB just before the
failure).
Widen `is_oom_error` to:
* walk the exception chain via `__cause__` / `__context__`, so that a
future PyTorch preserving the original OOM text is handled for free;
* keep matching the four plain CUDA OOM markers on every message in
the chain;
* additionally treat the AOTInductor wrapper signature
(`run_func_(` plus `model_container_runner`) as an OOM candidate.
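The widened check described above can be sketched as follows. The function name matches the commit, but the exact marker strings are illustrative — the real list of four plain CUDA OOM markers lives in the deepmd source, not here:

```python
# Illustrative OOM markers; the actual four used by deepmd may differ.
OOM_MARKERS = (
    "CUDA out of memory",
    "CUDA driver error: out of memory",
    "cusolver error",
    "CUFFT_INTERNAL_ERROR",
)
# Both substrings must appear to treat a message as the AOTInductor wrapper.
AOTI_WRAPPER_MARKERS = ("run_func_(", "model_container_runner")


def is_oom_error(e):
    """Return True if any exception in the __cause__/__context__ chain
    carries a CUDA OOM marker, or matches the AOTInductor wrapper
    signature."""
    seen = set()
    exc = e
    while exc is not None and id(exc) not in seen:
        seen.add(id(exc))  # guard against cycles in the chain
        msg = exc.args[0] if exc.args and isinstance(exc.args[0], str) else str(exc)
        if any(marker in msg for marker in OOM_MARKERS):
            return True
        if all(marker in msg for marker in AOTI_WRAPPER_MARKERS):
            return True
        # Prefer the explicit cause (raise ... from), fall back to the
        # implicit context (exception raised inside an except block).
        exc = exc.__cause__ or exc.__context__
    return False
```

Walking `__cause__` before `__context__` mirrors how Python itself prioritizes explicit chaining, so a future PyTorch that attaches the original OOM as a cause is picked up without further changes.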
If the AOTInductor wrapper ever hides a non-OOM failure, the batch
shrinker will halve down to 1 and then raise `OutOfMemoryError`, so
the fallback is bounded -- non-OOM bugs still surface with a clear
terminal error rather than being silently retried forever.

1 parent: 54f42d9
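The bounded fallback can be sketched as a halving loop. This is a hypothetical simplification, not deepmd's actual `AutoBatchSize.execute()`; `MemoryError` stands in for the `OutOfMemoryError` the real code raises, and the real code may catch a broader set of exception types:

```python
def execute_with_shrink(fn, batch_size, is_oom_error):
    """Run fn(batch_size), halving the batch on OOM-like failures.

    Non-OOM exceptions propagate immediately; OOM at batch size 1 is a
    terminal error, so a wrapper that hides a non-OOM bug can trigger
    at most log2(batch_size) retries before surfacing.
    """
    while True:
        try:
            return fn(batch_size)
        except RuntimeError as e:
            if not is_oom_error(e):
                raise  # unrelated bugs surface with their own traceback
            if batch_size <= 1:
                # Bounded fallback: nothing left to shrink.
                raise MemoryError("out of memory even at batch size 1") from e
            batch_size //= 2  # halve and retry
```

A quick usage example, with a stub that only fits batches of 4 or fewer:

```python
calls = []

def fn(b):
    calls.append(b)
    if b > 4:
        raise RuntimeError("CUDA out of memory")
    return b

assert execute_with_shrink(fn, 32, lambda e: "out of memory" in str(e)) == 4
assert calls == [32, 16, 8, 4]
```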
1 file changed: 47 additions & 14 deletions