2025-09-07T07:37:53.8514202Z Current runner version: '2.328.0'
2025-09-07T07:37:53.8518501Z Runner name: 'i-081e6be8c4291059d'
2025-09-07T07:37:53.8519213Z Runner group name: 'default'
2025-09-07T07:37:53.8519884Z Machine name: 'ip-10-0-37-56'
2025-09-07T07:37:53.8521765Z ##[group]GITHUB_TOKEN Permissions
2025-09-07T07:37:53.8523425Z Contents: read
2025-09-07T07:37:53.8523806Z Metadata: read
2025-09-07T07:37:53.8524183Z ##[endgroup]
2025-09-07T07:37:53.8525811Z Secret source: Actions
2025-09-07T07:37:53.8526391Z Prepare workflow directory
2025-09-07T07:37:53.8870075Z Prepare all required actions
2025-09-07T07:37:53.8900224Z Getting action download info
2025-09-07T07:37:54.1874413Z Download action repository 'pytorch/test-infra@main' (SHA:548a4bc624d43a01cdf165a63b041f0ae014ddbd)
2025-09-07T07:37:55.3138065Z Download action repository 'pytorch/pytorch@main' (SHA:ada43ed39c80b746b4822c92640a1882619e2795)
2025-09-07T07:38:05.9948315Z Download action repository 'actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065' (SHA:a26af69be951a213d495a4c3e4e4022e16d87065)
2025-09-07T07:38:06.2973231Z Download action repository 'aws-actions/configure-aws-credentials@ececac1a45f3b08a01d2dd070d28d111c5fe6722' (SHA:ececac1a45f3b08a01d2dd070d28d111c5fe6722)
2025-09-07T07:38:06.4939447Z Download action repository 'aws-actions/amazon-ecr-login@062b18b96a7aff071d4dc91bc00c4c1a7945b076' (SHA:062b18b96a7aff071d4dc91bc00c4c1a7945b076)
2025-09-07T07:38:06.6403742Z Download action repository 'seemethere/upload-artifact-s3@baba72d0712b404f646cebe0730933554ebce96a' (SHA:baba72d0712b404f646cebe0730933554ebce96a)
2025-09-07T07:38:06.8728409Z Getting action download info
2025-09-07T07:38:06.9625332Z Download action repository 'actions/checkout@v4' (SHA:08eba0b27e820071cde6df949e0beb9ba4906955)
2025-09-07T07:38:07.1672388Z Getting action download info
2025-09-07T07:38:07.2747898Z Download action repository 'nick-fields/retry@v3.0.0' (SHA:7152eba30c6575329ac0576536151aca5a72780e)
2025-09-07T07:38:07.4077602Z Getting action download info
2025-09-07T07:38:07.5161772Z Download action repository 'nick-fields/retry@3e91a01664abd3c5cd539100d10d33b9c5b68482' (SHA:3e91a01664abd3c5cd539100d10d33b9c5b68482)
2025-09-07T07:38:07.6589461Z Getting action download info
2025-09-07T07:38:07.7819138Z Uses: pytorch/pytorch/.github/workflows/_linux-test.yml@refs/heads/main (93fb23d6fae7c4e82c4239a1033e522088742634)
2025-09-07T07:38:07.7821969Z ##[group] Inputs
2025-09-07T07:38:07.7822266Z build-environment: linux-jammy-py3.9-gcc11-build
2025-09-07T07:38:07.7824601Z test-matrix: {"include": [{"config": "inductor_huggingface_perf_cpu_x86", "shard": 1, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 2, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 3, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 1, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 2, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 3, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 4, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 5, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 1, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 2, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 3, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 4, "num_shards": 4, "runner": "linux.24xl.spr-metal"}]}
2025-09-07T07:38:07.7827469Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77
2025-09-07T07:38:07.7828116Z sync-tag:
2025-09-07T07:38:07.7828697Z timeout-minutes: 720
2025-09-07T07:38:07.7828932Z use-gha:
2025-09-07T07:38:07.7829440Z dashboard-tag: training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true
2025-09-07T07:38:07.7829946Z s3-bucket: gha-artifacts
2025-09-07T07:38:07.7830162Z aws-role-to-assume:
2025-09-07T07:38:07.7830635Z disable-monitor: false
2025-09-07T07:38:07.7830892Z monitor-log-interval: 15
2025-09-07T07:38:07.7831133Z monitor-data-collect-interval: 4
2025-09-07T07:38:07.7831421Z ##[endgroup]
2025-09-07T07:38:07.7831819Z Complete job name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)
2025-09-07T07:38:07.8613645Z A job started hook has been configured by the self-hosted runner administrator
2025-09-07T07:38:07.8678580Z ##[group]Run '/home/ec2-user/runner-scripts/before_job.sh'
2025-09-07T07:38:07.8684693Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-09-07T07:38:07.8685094Z ##[endgroup]
2025-09-07T07:38:08.8051923Z Runner Type: linux.24xl.spr-metal
2025-09-07T07:38:08.8052272Z Instance Type: c7i.metal-24xl
2025-09-07T07:38:08.8052564Z AMI Name: unknown
2025-09-07T07:38:08.8075601Z AMI ID: ami-05ffe3c48a9991133
2025-09-07T07:38:12.4638699Z ##[group]Run pytorch/test-infra/.github/actions/setup-ssh@main
2025-09-07T07:38:12.4639002Z with:
2025-09-07T07:38:12.4639519Z github-secret: ***
2025-09-07T07:38:12.4639949Z instructions: All testing is done inside the container, to start an interactive session run: docker exec -it $(docker container ps --format '{{.ID}}') bash
2025-09-07T07:38:12.4640386Z activate-with-label: false
2025-09-07T07:38:12.4640573Z label: with-ssh
2025-09-07T07:38:12.4640745Z remove-existing-keys: true
2025-09-07T07:38:12.4640914Z fail-silently: true
2025-09-07T07:38:12.4641093Z env:
2025-09-07T07:38:12.4641237Z GIT_DEFAULT_BRANCH: main
2025-09-07T07:38:12.4641411Z ##[endgroup]
2025-09-07T07:38:12.5684480Z Please see https://github.com/pytorch/pytorch/wiki/Debugging-using-with-ssh-for-Github-Actions for more info.
2025-09-07T07:38:12.5685036Z Not on pull request and ciflow reference could not be extracted, skipping adding ssh keys
2025-09-07T07:38:12.5816085Z ##[group]Run pytorch/pytorch/.github/actions/checkout-pytorch@main
2025-09-07T07:38:12.5816357Z with:
2025-09-07T07:38:12.5816514Z no-sudo: true
2025-09-07T07:38:12.5816676Z submodules: recursive
2025-09-07T07:38:12.5816843Z fetch-depth: 0
2025-09-07T07:38:12.5816998Z env:
2025-09-07T07:38:12.5817144Z GIT_DEFAULT_BRANCH: main
2025-09-07T07:38:12.5817308Z ##[endgroup]
2025-09-07T07:38:12.5873106Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-09-07T07:38:12.5873657Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT"
2025-09-07T07:38:12.5880504Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-09-07T07:38:12.5880753Z env:
2025-09-07T07:38:12.5881002Z GIT_DEFAULT_BRANCH: main
2025-09-07T07:38:12.5881226Z ##[endgroup]
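The step above is a single shell one-liner. Unrolled for readability (an illustrative sketch, not the script the workflow itself runs), it probes two marker files to decide whether the runner is itself inside a container and records the answer as a step output:

  # Sketch: equivalent of the IN_CONTAINER_RUNNER one-liner logged above.
  # /.inarc and /.incontainer are the marker files the step checks; the result
  # is appended to $GITHUB_OUTPUT so later steps can read it.
  if [ -f /.inarc ] || [ -f /.incontainer ]; then
    in_container=true
  else
    in_container=false
  fi
  echo "IN_CONTAINER_RUNNER=${in_container}" >> "$GITHUB_OUTPUT"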
2025-09-07T07:38:12.5943200Z ##[group]Run # Use all available CPUs for fetching
2025-09-07T07:38:12.5943486Z # Use all available CPUs for fetching
2025-09-07T07:38:12.5943701Z cd "${GITHUB_WORKSPACE}"
2025-09-07T07:38:12.5943906Z git config --global fetch.parallel 0
2025-09-07T07:38:12.5944137Z git config --global submodule.fetchJobs 0
2025-09-07T07:38:12.5944341Z 
2025-09-07T07:38:12.5944566Z # Clean workspace. The default checkout action should also do this, but
2025-09-07T07:38:12.5944830Z # do it here as well just in case
2025-09-07T07:38:12.5945027Z if [[ -d .git ]]; then
2025-09-07T07:38:12.5945222Z   if [ -z "${NO_SUDO}" ]; then
2025-09-07T07:38:12.5945413Z     sudo git clean -ffdx
2025-09-07T07:38:12.5945582Z   else
2025-09-07T07:38:12.5945848Z     git clean -ffdx
2025-09-07T07:38:12.5946015Z   fi
2025-09-07T07:38:12.5946162Z fi
2025-09-07T07:38:12.5949797Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}
2025-09-07T07:38:12.5950024Z env:
2025-09-07T07:38:12.5950171Z GIT_DEFAULT_BRANCH: main
2025-09-07T07:38:12.5950342Z NO_SUDO: true
2025-09-07T07:38:12.5950484Z ##[endgroup]
2025-09-07T07:38:12.6045116Z ##[group]Run actions/checkout@v4
2025-09-07T07:38:12.6045339Z with:
2025-09-07T07:38:12.6045532Z ref: 93fb23d6fae7c4e82c4239a1033e522088742634
2025-09-07T07:38:12.6045767Z fetch-depth: 0
2025-09-07T07:38:12.6045944Z submodules: recursive
2025-09-07T07:38:12.6046138Z show-progress: false
2025-09-07T07:38:12.6046328Z repository: pytorch/pytorch
2025-09-07T07:38:12.6046696Z token: ***
2025-09-07T07:38:12.6046872Z ssh-strict: true
2025-09-07T07:38:12.6047047Z ssh-user: git
2025-09-07T07:38:12.6047230Z persist-credentials: true
2025-09-07T07:38:12.6047425Z clean: true
2025-09-07T07:38:12.6047611Z sparse-checkout-cone-mode: true
2025-09-07T07:38:12.6047833Z fetch-tags: false
2025-09-07T07:38:12.6048009Z lfs: false
2025-09-07T07:38:12.6048181Z set-safe-directory: true
2025-09-07T07:38:12.6048376Z env:
2025-09-07T07:38:12.6048529Z GIT_DEFAULT_BRANCH: main
2025-09-07T07:38:12.6048711Z ##[endgroup]
2025-09-07T07:38:12.6859124Z Syncing repository: pytorch/pytorch
2025-09-07T07:38:12.6860098Z ##[group]Getting Git version info
2025-09-07T07:38:12.6860408Z Working directory is '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-09-07T07:38:12.6860849Z [command]/usr/bin/git version
2025-09-07T07:38:12.7055494Z git version 2.47.1
2025-09-07T07:38:12.7072163Z ##[endgroup]
2025-09-07T07:38:12.7081043Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/aa15d88a-2bfb-4a6a-873e-066283b008b4/.gitconfig'
2025-09-07T07:38:12.7098827Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/aa15d88a-2bfb-4a6a-873e-066283b008b4' before making global git config changes
2025-09-07T07:38:12.7099428Z Adding repository directory to the temporary git global config as a safe directory
2025-09-07T07:38:12.7102304Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-09-07T07:38:12.7136490Z Deleting the contents of '/home/ec2-user/actions-runner/_work/pytorch/pytorch'
2025-09-07T07:38:12.7138771Z ##[group]Initializing the repository
2025-09-07T07:38:12.7148248Z [command]/usr/bin/git init /home/ec2-user/actions-runner/_work/pytorch/pytorch
2025-09-07T07:38:12.7193160Z hint: Using 'master' as the name for the initial branch. This default branch name
2025-09-07T07:38:12.7193509Z hint: is subject to change. To configure the initial branch name to use in all
2025-09-07T07:38:12.7193842Z hint: of your new repositories, which will suppress this warning, call:
2025-09-07T07:38:12.7194090Z hint:
2025-09-07T07:38:12.7194302Z hint: git config --global init.defaultBranch <name>
2025-09-07T07:38:12.7194525Z hint:
2025-09-07T07:38:12.7194745Z hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
2025-09-07T07:38:12.7195078Z hint: 'development'. The just-created branch can be renamed via this command:
2025-09-07T07:38:12.7195328Z hint:
2025-09-07T07:38:12.7195482Z hint: git branch -m <name>
2025-09-07T07:38:12.7225559Z Initialized empty Git repository in /home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/
2025-09-07T07:38:12.7231647Z [command]/usr/bin/git remote add origin https://github.com/pytorch/pytorch
2025-09-07T07:38:12.7261043Z ##[endgroup]
2025-09-07T07:38:12.7261346Z ##[group]Disabling automatic garbage collection
2025-09-07T07:38:12.7264529Z [command]/usr/bin/git config --local gc.auto 0
2025-09-07T07:38:12.7283475Z ##[endgroup]
2025-09-07T07:38:12.7283755Z ##[group]Setting up auth
2025-09-07T07:38:12.7288386Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand
2025-09-07T07:38:12.7307004Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :"
2025-09-07T07:38:12.7565651Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader
2025-09-07T07:38:12.7585364Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :"
2025-09-07T07:38:12.7851929Z [command]/usr/bin/git config --local http.https://github.com/.extraheader AUTHORIZATION: basic ***
2025-09-07T07:38:12.7894811Z ##[endgroup]
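In the auth step above, actions/checkout stores the job credential as a basic Authorization header scoped to https://github.com/ (the value is masked as ***), so the fetch that follows can authenticate over HTTPS without embedding a token in the remote URL. A rough sketch of the same idea, assuming TOKEN holds a GitHub token and that the header value is the base64 encoding of 'x-access-token:<token>' (treat that encoding as an assumption rather than a confirmed detail of this run):

  # Sketch only: configure an extraheader like the one set above, with a
  # stand-in TOKEN variable in place of the job's masked credential.
  B64=$(printf 'x-access-token:%s' "$TOKEN" | base64 | tr -d '\n')
  git config --local http.https://github.com/.extraheader "AUTHORIZATION: basic ${B64}"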
2025-09-07T07:38:12.7895135Z ##[group]Fetching the repository
2025-09-07T07:38:12.7900246Z [command]/usr/bin/git -c protocol.version=2 fetch --prune --no-recurse-submodules origin +refs/heads/*:refs/remotes/origin/* +refs/tags/*:refs/tags/*
2025-09-07T07:38:46.2587659Z From https://github.com/pytorch/pytorch
[... several hundred '* [new branch] <branch> -> origin/<branch>' lines as the fetch enumerates every remote branch (origin/160583, origin/2.6.0.dev20241004+, origin/5addvllmbuild, ..., origin/gh/PaulZhang12/17/head); the listing is truncated here ...]
2025-09-07T07:38:46.2925725Z * [new branch] gh/PaulZhang12/20/base -> origin/gh/PaulZhang12/20/base 2025-09-07T07:38:46.2926166Z * [new branch] gh/PaulZhang12/20/head -> origin/gh/PaulZhang12/20/head 2025-09-07T07:38:46.2926680Z * [new branch] gh/PaulZhang12/20/orig -> origin/gh/PaulZhang12/20/orig 2025-09-07T07:38:46.2927564Z * [new branch] gh/PaulZhang12/21/base -> origin/gh/PaulZhang12/21/base 2025-09-07T07:38:46.2928096Z * [new branch] gh/PaulZhang12/21/head -> origin/gh/PaulZhang12/21/head 2025-09-07T07:38:46.2928629Z * [new branch] gh/PaulZhang12/21/orig -> origin/gh/PaulZhang12/21/orig 2025-09-07T07:38:46.2929510Z * [new branch] gh/PaulZhang12/22/base -> origin/gh/PaulZhang12/22/base 2025-09-07T07:38:46.2929954Z * [new branch] gh/PaulZhang12/22/head -> origin/gh/PaulZhang12/22/head 2025-09-07T07:38:46.2930740Z * [new branch] gh/PaulZhang12/22/orig -> origin/gh/PaulZhang12/22/orig 2025-09-07T07:38:46.2931374Z * [new branch] gh/PaulZhang12/23/base -> origin/gh/PaulZhang12/23/base 2025-09-07T07:38:46.2931902Z * [new branch] gh/PaulZhang12/23/head -> origin/gh/PaulZhang12/23/head 2025-09-07T07:38:46.2932433Z * [new branch] gh/PaulZhang12/23/orig -> origin/gh/PaulZhang12/23/orig 2025-09-07T07:38:46.2933082Z * [new branch] gh/PaulZhang12/24/base -> origin/gh/PaulZhang12/24/base 2025-09-07T07:38:46.2933606Z * [new branch] gh/PaulZhang12/24/head -> origin/gh/PaulZhang12/24/head 2025-09-07T07:38:46.2934148Z * [new branch] gh/PaulZhang12/24/orig -> origin/gh/PaulZhang12/24/orig 2025-09-07T07:38:46.2934987Z * [new branch] gh/PaulZhang12/25/base -> origin/gh/PaulZhang12/25/base 2025-09-07T07:38:46.2935446Z * [new branch] gh/PaulZhang12/25/head -> origin/gh/PaulZhang12/25/head 2025-09-07T07:38:46.2935999Z * [new branch] gh/PaulZhang12/25/orig -> origin/gh/PaulZhang12/25/orig 2025-09-07T07:38:46.2937045Z * [new branch] gh/SamGinzburg/11/base -> origin/gh/SamGinzburg/11/base 2025-09-07T07:38:46.2937527Z * [new branch] gh/SamGinzburg/11/head -> origin/gh/SamGinzburg/11/head 2025-09-07T07:38:46.2938930Z * [new branch] gh/Sidharth123-cpu/24/base -> origin/gh/Sidharth123-cpu/24/base 2025-09-07T07:38:46.2939554Z * [new branch] gh/Sidharth123-cpu/25/base -> origin/gh/Sidharth123-cpu/25/base 2025-09-07T07:38:46.2940428Z * [new branch] gh/Sidharth123-cpu/26/base -> origin/gh/Sidharth123-cpu/26/base 2025-09-07T07:38:46.2941225Z * [new branch] gh/Sidharth123-cpu/27/base -> origin/gh/Sidharth123-cpu/27/base 2025-09-07T07:38:46.2942244Z * [new branch] gh/StrongerXi/1/base -> origin/gh/StrongerXi/1/base 2025-09-07T07:38:46.2942679Z * [new branch] gh/StrongerXi/1/head -> origin/gh/StrongerXi/1/head 2025-09-07T07:38:46.2943538Z * [new branch] gh/StrongerXi/133/base -> origin/gh/StrongerXi/133/base 2025-09-07T07:38:46.2944044Z * [new branch] gh/StrongerXi/133/head -> origin/gh/StrongerXi/133/head 2025-09-07T07:38:46.2944572Z * [new branch] gh/StrongerXi/133/orig -> origin/gh/StrongerXi/133/orig 2025-09-07T07:38:46.2945417Z * [new branch] gh/StrongerXi/134/base -> origin/gh/StrongerXi/134/base 2025-09-07T07:38:46.2945854Z * [new branch] gh/StrongerXi/134/head -> origin/gh/StrongerXi/134/head 2025-09-07T07:38:46.2946367Z * [new branch] gh/StrongerXi/134/orig -> origin/gh/StrongerXi/134/orig 2025-09-07T07:38:46.2947208Z * [new branch] gh/StrongerXi/136/base -> origin/gh/StrongerXi/136/base 2025-09-07T07:38:46.2947629Z * [new branch] gh/StrongerXi/136/head -> origin/gh/StrongerXi/136/head 2025-09-07T07:38:46.2948176Z * [new branch] gh/StrongerXi/136/orig -> origin/gh/StrongerXi/136/orig 2025-09-07T07:38:46.2949083Z * 
[new branch] gh/StrongerXi/137/base -> origin/gh/StrongerXi/137/base 2025-09-07T07:38:46.2949455Z * [new branch] gh/StrongerXi/137/head -> origin/gh/StrongerXi/137/head 2025-09-07T07:38:46.2950225Z * [new branch] gh/StrongerXi/137/orig -> origin/gh/StrongerXi/137/orig 2025-09-07T07:38:46.2950888Z * [new branch] gh/StrongerXi/138/base -> origin/gh/StrongerXi/138/base 2025-09-07T07:38:46.2951385Z * [new branch] gh/StrongerXi/138/head -> origin/gh/StrongerXi/138/head 2025-09-07T07:38:46.2951933Z * [new branch] gh/StrongerXi/138/orig -> origin/gh/StrongerXi/138/orig 2025-09-07T07:38:46.2952750Z * [new branch] gh/StrongerXi/139/base -> origin/gh/StrongerXi/139/base 2025-09-07T07:38:46.2953169Z * [new branch] gh/StrongerXi/139/head -> origin/gh/StrongerXi/139/head 2025-09-07T07:38:46.2953768Z * [new branch] gh/StrongerXi/139/orig -> origin/gh/StrongerXi/139/orig 2025-09-07T07:38:46.2954589Z * [new branch] gh/StrongerXi/140/base -> origin/gh/StrongerXi/140/base 2025-09-07T07:38:46.2955012Z * [new branch] gh/StrongerXi/140/head -> origin/gh/StrongerXi/140/head 2025-09-07T07:38:46.2955799Z * [new branch] gh/StrongerXi/140/orig -> origin/gh/StrongerXi/140/orig 2025-09-07T07:38:46.2956441Z * [new branch] gh/StrongerXi/71/base -> origin/gh/StrongerXi/71/base 2025-09-07T07:38:46.2956971Z * [new branch] gh/StrongerXi/71/head -> origin/gh/StrongerXi/71/head 2025-09-07T07:38:46.2957739Z * [new branch] gh/StrongerXi/72/base -> origin/gh/StrongerXi/72/base 2025-09-07T07:38:46.2958187Z * [new branch] gh/StrongerXi/72/head -> origin/gh/StrongerXi/72/head 2025-09-07T07:38:46.2959289Z * [new branch] gh/XilunWu/133/base -> origin/gh/XilunWu/133/base 2025-09-07T07:38:46.2959815Z * [new branch] gh/XilunWu/133/head -> origin/gh/XilunWu/133/head 2025-09-07T07:38:46.2960426Z * [new branch] gh/XilunWu/133/orig -> origin/gh/XilunWu/133/orig 2025-09-07T07:38:46.2961220Z * [new branch] gh/XilunWu/139/base -> origin/gh/XilunWu/139/base 2025-09-07T07:38:46.2961706Z * [new branch] gh/XilunWu/139/head -> origin/gh/XilunWu/139/head 2025-09-07T07:38:46.2962125Z * [new branch] gh/XilunWu/139/orig -> origin/gh/XilunWu/139/orig 2025-09-07T07:38:46.2963097Z * [new branch] gh/XilunWu/143/base -> origin/gh/XilunWu/143/base 2025-09-07T07:38:46.2963629Z * [new branch] gh/XilunWu/143/head -> origin/gh/XilunWu/143/head 2025-09-07T07:38:46.2964132Z * [new branch] gh/XilunWu/143/orig -> origin/gh/XilunWu/143/orig 2025-09-07T07:38:46.2965078Z * [new branch] gh/XilunWu/144/base -> origin/gh/XilunWu/144/base 2025-09-07T07:38:46.2965496Z * [new branch] gh/XilunWu/144/head -> origin/gh/XilunWu/144/head 2025-09-07T07:38:46.2966043Z * [new branch] gh/XilunWu/144/orig -> origin/gh/XilunWu/144/orig 2025-09-07T07:38:46.2966899Z * [new branch] gh/XilunWu/145/base -> origin/gh/XilunWu/145/base 2025-09-07T07:38:46.2967305Z * [new branch] gh/XilunWu/145/head -> origin/gh/XilunWu/145/head 2025-09-07T07:38:46.2967786Z * [new branch] gh/XilunWu/145/orig -> origin/gh/XilunWu/145/orig 2025-09-07T07:38:46.2968606Z * [new branch] gh/XilunWu/146/base -> origin/gh/XilunWu/146/base 2025-09-07T07:38:46.2969099Z * [new branch] gh/XilunWu/146/head -> origin/gh/XilunWu/146/head 2025-09-07T07:38:46.2969596Z * [new branch] gh/XilunWu/146/orig -> origin/gh/XilunWu/146/orig 2025-09-07T07:38:46.2970465Z * [new branch] gh/XilunWu/147/base -> origin/gh/XilunWu/147/base 2025-09-07T07:38:46.2970939Z * [new branch] gh/XilunWu/147/head -> origin/gh/XilunWu/147/head 2025-09-07T07:38:46.2971465Z * [new branch] gh/XilunWu/147/orig -> origin/gh/XilunWu/147/orig 
2025-09-07T07:38:46.2972220Z * [new branch] gh/XilunWu/148/base -> origin/gh/XilunWu/148/base 2025-09-07T07:38:46.2972669Z * [new branch] gh/XilunWu/148/head -> origin/gh/XilunWu/148/head 2025-09-07T07:38:46.2973186Z * [new branch] gh/XilunWu/148/orig -> origin/gh/XilunWu/148/orig 2025-09-07T07:38:46.2973871Z * [new branch] gh/XilunWu/149/base -> origin/gh/XilunWu/149/base 2025-09-07T07:38:46.2974352Z * [new branch] gh/XilunWu/149/head -> origin/gh/XilunWu/149/head 2025-09-07T07:38:46.2974879Z * [new branch] gh/XilunWu/149/orig -> origin/gh/XilunWu/149/orig 2025-09-07T07:38:46.2975625Z * [new branch] gh/XilunWu/150/base -> origin/gh/XilunWu/150/base 2025-09-07T07:38:46.2976071Z * [new branch] gh/XilunWu/150/head -> origin/gh/XilunWu/150/head 2025-09-07T07:38:46.2976623Z * [new branch] gh/XilunWu/150/orig -> origin/gh/XilunWu/150/orig 2025-09-07T07:38:46.2977544Z * [new branch] gh/XilunWu/151/base -> origin/gh/XilunWu/151/base 2025-09-07T07:38:46.2978122Z * [new branch] gh/XilunWu/151/head -> origin/gh/XilunWu/151/head 2025-09-07T07:38:46.2978685Z * [new branch] gh/XilunWu/151/orig -> origin/gh/XilunWu/151/orig 2025-09-07T07:38:46.2979528Z * [new branch] gh/XilunWu/152/base -> origin/gh/XilunWu/152/base 2025-09-07T07:38:46.2979843Z * [new branch] gh/XilunWu/152/head -> origin/gh/XilunWu/152/head 2025-09-07T07:38:46.2980376Z * [new branch] gh/XilunWu/152/orig -> origin/gh/XilunWu/152/orig 2025-09-07T07:38:46.2981286Z * [new branch] gh/XilunWu/153/base -> origin/gh/XilunWu/153/base 2025-09-07T07:38:46.2981897Z * [new branch] gh/XilunWu/153/head -> origin/gh/XilunWu/153/head 2025-09-07T07:38:46.2982396Z * [new branch] gh/XilunWu/153/orig -> origin/gh/XilunWu/153/orig 2025-09-07T07:38:46.2983322Z * [new branch] gh/XilunWu/160/base -> origin/gh/XilunWu/160/base 2025-09-07T07:38:46.2983727Z * [new branch] gh/XilunWu/160/head -> origin/gh/XilunWu/160/head 2025-09-07T07:38:46.2984310Z * [new branch] gh/XilunWu/160/orig -> origin/gh/XilunWu/160/orig 2025-09-07T07:38:46.2985302Z * [new branch] gh/XilunWu/161/base -> origin/gh/XilunWu/161/base 2025-09-07T07:38:46.2985757Z * [new branch] gh/XilunWu/161/head -> origin/gh/XilunWu/161/head 2025-09-07T07:38:46.2986278Z * [new branch] gh/XilunWu/161/orig -> origin/gh/XilunWu/161/orig 2025-09-07T07:38:46.2987312Z * [new branch] gh/XilunWu/163/base -> origin/gh/XilunWu/163/base 2025-09-07T07:38:46.2987749Z * [new branch] gh/XilunWu/163/head -> origin/gh/XilunWu/163/head 2025-09-07T07:38:46.2988284Z * [new branch] gh/XilunWu/163/orig -> origin/gh/XilunWu/163/orig 2025-09-07T07:38:46.2989428Z * [new branch] gh/XilunWu/164/base -> origin/gh/XilunWu/164/base 2025-09-07T07:38:46.2989952Z * [new branch] gh/XilunWu/164/head -> origin/gh/XilunWu/164/head 2025-09-07T07:38:46.2990440Z * [new branch] gh/XilunWu/164/orig -> origin/gh/XilunWu/164/orig 2025-09-07T07:38:46.2991404Z * [new branch] gh/XilunWu/165/base -> origin/gh/XilunWu/165/base 2025-09-07T07:38:46.2991964Z * [new branch] gh/XilunWu/165/head -> origin/gh/XilunWu/165/head 2025-09-07T07:38:46.2992500Z * [new branch] gh/XilunWu/165/orig -> origin/gh/XilunWu/165/orig 2025-09-07T07:38:46.2993453Z * [new branch] gh/XilunWu/166/base -> origin/gh/XilunWu/166/base 2025-09-07T07:38:46.2993936Z * [new branch] gh/XilunWu/166/head -> origin/gh/XilunWu/166/head 2025-09-07T07:38:46.2994486Z * [new branch] gh/XilunWu/166/orig -> origin/gh/XilunWu/166/orig 2025-09-07T07:38:46.2995317Z * [new branch] gh/XilunWu/167/base -> origin/gh/XilunWu/167/base 2025-09-07T07:38:46.2995793Z * [new branch] gh/XilunWu/167/head -> 
origin/gh/XilunWu/167/head 2025-09-07T07:38:46.2996316Z * [new branch] gh/XilunWu/167/orig -> origin/gh/XilunWu/167/orig 2025-09-07T07:38:46.2997363Z * [new branch] gh/XilunWu/168/base -> origin/gh/XilunWu/168/base 2025-09-07T07:38:46.2997758Z * [new branch] gh/XilunWu/168/head -> origin/gh/XilunWu/168/head 2025-09-07T07:38:46.2998265Z * [new branch] gh/XilunWu/168/orig -> origin/gh/XilunWu/168/orig 2025-09-07T07:38:46.2999179Z * [new branch] gh/XilunWu/169/base -> origin/gh/XilunWu/169/base 2025-09-07T07:38:46.2999626Z * [new branch] gh/XilunWu/169/head -> origin/gh/XilunWu/169/head 2025-09-07T07:38:46.3000161Z * [new branch] gh/XilunWu/169/orig -> origin/gh/XilunWu/169/orig 2025-09-07T07:38:46.3000911Z * [new branch] gh/XilunWu/170/base -> origin/gh/XilunWu/170/base 2025-09-07T07:38:46.3001367Z * [new branch] gh/XilunWu/170/head -> origin/gh/XilunWu/170/head 2025-09-07T07:38:46.3001901Z * [new branch] gh/XilunWu/170/orig -> origin/gh/XilunWu/170/orig 2025-09-07T07:38:46.3003033Z * [new branch] gh/XuehaiPan/14/base -> origin/gh/XuehaiPan/14/base 2025-09-07T07:38:46.3003532Z * [new branch] gh/XuehaiPan/14/head -> origin/gh/XuehaiPan/14/head 2025-09-07T07:38:46.3003983Z * [new branch] gh/XuehaiPan/14/orig -> origin/gh/XuehaiPan/14/orig 2025-09-07T07:38:46.3004822Z * [new branch] gh/XuehaiPan/179/base -> origin/gh/XuehaiPan/179/base 2025-09-07T07:38:46.3005290Z * [new branch] gh/XuehaiPan/179/head -> origin/gh/XuehaiPan/179/head 2025-09-07T07:38:46.3005858Z * [new branch] gh/XuehaiPan/179/orig -> origin/gh/XuehaiPan/179/orig 2025-09-07T07:38:46.3006860Z * [new branch] gh/XuehaiPan/189/base -> origin/gh/XuehaiPan/189/base 2025-09-07T07:38:46.3007305Z * [new branch] gh/XuehaiPan/189/head -> origin/gh/XuehaiPan/189/head 2025-09-07T07:38:46.3007845Z * [new branch] gh/XuehaiPan/189/orig -> origin/gh/XuehaiPan/189/orig 2025-09-07T07:38:46.3008698Z * [new branch] gh/XuehaiPan/232/base -> origin/gh/XuehaiPan/232/base 2025-09-07T07:38:46.3009093Z * [new branch] gh/XuehaiPan/232/head -> origin/gh/XuehaiPan/232/head 2025-09-07T07:38:46.3009620Z * [new branch] gh/XuehaiPan/232/orig -> origin/gh/XuehaiPan/232/orig 2025-09-07T07:38:46.3010417Z * [new branch] gh/XuehaiPan/249/base -> origin/gh/XuehaiPan/249/base 2025-09-07T07:38:46.3010961Z * [new branch] gh/XuehaiPan/249/head -> origin/gh/XuehaiPan/249/head 2025-09-07T07:38:46.3011445Z * [new branch] gh/XuehaiPan/249/orig -> origin/gh/XuehaiPan/249/orig 2025-09-07T07:38:46.3012211Z * [new branch] gh/XuehaiPan/253/base -> origin/gh/XuehaiPan/253/base 2025-09-07T07:38:46.3012658Z * [new branch] gh/XuehaiPan/253/head -> origin/gh/XuehaiPan/253/head 2025-09-07T07:38:46.3013213Z * [new branch] gh/XuehaiPan/253/orig -> origin/gh/XuehaiPan/253/orig 2025-09-07T07:38:46.3013968Z * [new branch] gh/XuehaiPan/254/base -> origin/gh/XuehaiPan/254/base 2025-09-07T07:38:46.3014398Z * [new branch] gh/XuehaiPan/254/head -> origin/gh/XuehaiPan/254/head 2025-09-07T07:38:46.3015000Z * [new branch] gh/XuehaiPan/254/orig -> origin/gh/XuehaiPan/254/orig 2025-09-07T07:38:46.3015940Z * [new branch] gh/XuehaiPan/255/base -> origin/gh/XuehaiPan/255/base 2025-09-07T07:38:46.3016389Z * [new branch] gh/XuehaiPan/255/head -> origin/gh/XuehaiPan/255/head 2025-09-07T07:38:46.3016858Z * [new branch] gh/XuehaiPan/255/orig -> origin/gh/XuehaiPan/255/orig 2025-09-07T07:38:46.3017692Z * [new branch] gh/XuehaiPan/257/base -> origin/gh/XuehaiPan/257/base 2025-09-07T07:38:46.3018127Z * [new branch] gh/XuehaiPan/257/head -> origin/gh/XuehaiPan/257/head 2025-09-07T07:38:46.3018631Z * [new branch] 
gh/XuehaiPan/257/orig -> origin/gh/XuehaiPan/257/orig 2025-09-07T07:38:46.3019456Z * [new branch] gh/XuehaiPan/271/base -> origin/gh/XuehaiPan/271/base 2025-09-07T07:38:46.3019851Z * [new branch] gh/XuehaiPan/271/head -> origin/gh/XuehaiPan/271/head 2025-09-07T07:38:46.3020366Z * [new branch] gh/XuehaiPan/271/orig -> origin/gh/XuehaiPan/271/orig 2025-09-07T07:38:46.3021186Z * [new branch] gh/XuehaiPan/290/base -> origin/gh/XuehaiPan/290/base 2025-09-07T07:38:46.3021708Z * [new branch] gh/XuehaiPan/290/head -> origin/gh/XuehaiPan/290/head 2025-09-07T07:38:46.3022172Z * [new branch] gh/XuehaiPan/290/orig -> origin/gh/XuehaiPan/290/orig 2025-09-07T07:38:46.3023020Z * [new branch] gh/XuehaiPan/343/base -> origin/gh/XuehaiPan/343/base 2025-09-07T07:38:46.3023474Z * [new branch] gh/XuehaiPan/343/head -> origin/gh/XuehaiPan/343/head 2025-09-07T07:38:46.3024013Z * [new branch] gh/XuehaiPan/343/orig -> origin/gh/XuehaiPan/343/orig 2025-09-07T07:38:46.3024961Z * [new branch] gh/XuehaiPan/347/base -> origin/gh/XuehaiPan/347/base 2025-09-07T07:38:46.3025417Z * [new branch] gh/XuehaiPan/347/head -> origin/gh/XuehaiPan/347/head 2025-09-07T07:38:46.3025985Z * [new branch] gh/XuehaiPan/347/orig -> origin/gh/XuehaiPan/347/orig 2025-09-07T07:38:46.3026778Z * [new branch] gh/XuehaiPan/348/base -> origin/gh/XuehaiPan/348/base 2025-09-07T07:38:46.3036861Z * [new branch] gh/XuehaiPan/348/head -> origin/gh/XuehaiPan/348/head 2025-09-07T07:38:46.3037207Z * [new branch] gh/XuehaiPan/348/orig -> origin/gh/XuehaiPan/348/orig 2025-09-07T07:38:46.3037517Z * [new branch] gh/XuehaiPan/350/base -> origin/gh/XuehaiPan/350/base 2025-09-07T07:38:46.3037821Z * [new branch] gh/XuehaiPan/350/head -> origin/gh/XuehaiPan/350/head 2025-09-07T07:38:46.3038136Z * [new branch] gh/XuehaiPan/350/orig -> origin/gh/XuehaiPan/350/orig 2025-09-07T07:38:46.3038445Z * [new branch] gh/XuehaiPan/356/base -> origin/gh/XuehaiPan/356/base 2025-09-07T07:38:46.3038758Z * [new branch] gh/XuehaiPan/356/head -> origin/gh/XuehaiPan/356/head 2025-09-07T07:38:46.3039072Z * [new branch] gh/XuehaiPan/356/orig -> origin/gh/XuehaiPan/356/orig 2025-09-07T07:38:46.3039371Z * [new branch] gh/XuehaiPan/357/base -> origin/gh/XuehaiPan/357/base 2025-09-07T07:38:46.3039675Z * [new branch] gh/XuehaiPan/357/head -> origin/gh/XuehaiPan/357/head 2025-09-07T07:38:46.3039984Z * [new branch] gh/XuehaiPan/357/orig -> origin/gh/XuehaiPan/357/orig 2025-09-07T07:38:46.3040287Z * [new branch] gh/XuehaiPan/358/base -> origin/gh/XuehaiPan/358/base 2025-09-07T07:38:46.3040592Z * [new branch] gh/XuehaiPan/358/head -> origin/gh/XuehaiPan/358/head 2025-09-07T07:38:46.3040890Z * [new branch] gh/XuehaiPan/358/orig -> origin/gh/XuehaiPan/358/orig 2025-09-07T07:38:46.3041202Z * [new branch] gh/XuehaiPan/359/base -> origin/gh/XuehaiPan/359/base 2025-09-07T07:38:46.3041554Z * [new branch] gh/XuehaiPan/359/head -> origin/gh/XuehaiPan/359/head 2025-09-07T07:38:46.3041864Z * [new branch] gh/XuehaiPan/359/orig -> origin/gh/XuehaiPan/359/orig 2025-09-07T07:38:46.3042180Z * [new branch] gh/XuehaiPan/360/base -> origin/gh/XuehaiPan/360/base 2025-09-07T07:38:46.3042478Z * [new branch] gh/XuehaiPan/360/head -> origin/gh/XuehaiPan/360/head 2025-09-07T07:38:46.3042783Z * [new branch] gh/XuehaiPan/360/orig -> origin/gh/XuehaiPan/360/orig 2025-09-07T07:38:46.3043089Z * [new branch] gh/XuehaiPan/365/base -> origin/gh/XuehaiPan/365/base 2025-09-07T07:38:46.3043393Z * [new branch] gh/XuehaiPan/365/head -> origin/gh/XuehaiPan/365/head 2025-09-07T07:38:46.3043701Z * [new branch] gh/XuehaiPan/365/orig -> 
origin/gh/XuehaiPan/365/orig 2025-09-07T07:38:46.3043999Z * [new branch] gh/XuehaiPan/366/base -> origin/gh/XuehaiPan/366/base 2025-09-07T07:38:46.3044306Z * [new branch] gh/XuehaiPan/366/head -> origin/gh/XuehaiPan/366/head 2025-09-07T07:38:46.3044616Z * [new branch] gh/XuehaiPan/369/base -> origin/gh/XuehaiPan/369/base 2025-09-07T07:38:46.3044917Z * [new branch] gh/XuehaiPan/369/head -> origin/gh/XuehaiPan/369/head 2025-09-07T07:38:46.3045224Z * [new branch] gh/XuehaiPan/369/orig -> origin/gh/XuehaiPan/369/orig 2025-09-07T07:38:46.3045519Z * [new branch] gh/XuehaiPan/370/base -> origin/gh/XuehaiPan/370/base 2025-09-07T07:38:46.3045831Z * [new branch] gh/XuehaiPan/370/head -> origin/gh/XuehaiPan/370/head 2025-09-07T07:38:46.3046138Z * [new branch] gh/XuehaiPan/370/orig -> origin/gh/XuehaiPan/370/orig 2025-09-07T07:38:46.3046470Z * [new branch] gh/XuehaiPan/380/base -> origin/gh/XuehaiPan/380/base 2025-09-07T07:38:46.3046976Z * [new branch] gh/XuehaiPan/380/head -> origin/gh/XuehaiPan/380/head 2025-09-07T07:38:46.3047735Z * [new branch] gh/XuehaiPan/380/orig -> origin/gh/XuehaiPan/380/orig 2025-09-07T07:38:46.3048287Z * [new branch] gh/XuehaiPan/381/base -> origin/gh/XuehaiPan/381/base 2025-09-07T07:38:46.3048770Z * [new branch] gh/XuehaiPan/381/head -> origin/gh/XuehaiPan/381/head 2025-09-07T07:38:46.3049632Z * [new branch] gh/XuehaiPan/382/base -> origin/gh/XuehaiPan/382/base 2025-09-07T07:38:46.3050095Z * [new branch] gh/XuehaiPan/382/head -> origin/gh/XuehaiPan/382/head 2025-09-07T07:38:46.3050616Z * [new branch] gh/XuehaiPan/382/orig -> origin/gh/XuehaiPan/382/orig 2025-09-07T07:38:46.3051480Z * [new branch] gh/XuehaiPan/383/base -> origin/gh/XuehaiPan/383/base 2025-09-07T07:38:46.3052076Z * [new branch] gh/XuehaiPan/383/head -> origin/gh/XuehaiPan/383/head 2025-09-07T07:38:46.3052638Z * [new branch] gh/XuehaiPan/383/orig -> origin/gh/XuehaiPan/383/orig 2025-09-07T07:38:46.3053432Z * [new branch] gh/XuehaiPan/384/base -> origin/gh/XuehaiPan/384/base 2025-09-07T07:38:46.3053907Z * [new branch] gh/XuehaiPan/384/head -> origin/gh/XuehaiPan/384/head 2025-09-07T07:38:46.3054395Z * [new branch] gh/XuehaiPan/384/orig -> origin/gh/XuehaiPan/384/orig 2025-09-07T07:38:46.3055288Z * [new branch] gh/XuehaiPan/385/base -> origin/gh/XuehaiPan/385/base 2025-09-07T07:38:46.3055712Z * [new branch] gh/XuehaiPan/385/head -> origin/gh/XuehaiPan/385/head 2025-09-07T07:38:46.3056199Z * [new branch] gh/XuehaiPan/385/orig -> origin/gh/XuehaiPan/385/orig 2025-09-07T07:38:46.3056965Z * [new branch] gh/XuehaiPan/386/base -> origin/gh/XuehaiPan/386/base 2025-09-07T07:38:46.3057396Z * [new branch] gh/XuehaiPan/386/head -> origin/gh/XuehaiPan/386/head 2025-09-07T07:38:46.3057928Z * [new branch] gh/XuehaiPan/386/orig -> origin/gh/XuehaiPan/386/orig 2025-09-07T07:38:46.3058752Z * [new branch] gh/XuehaiPan/387/base -> origin/gh/XuehaiPan/387/base 2025-09-07T07:38:46.3059152Z * [new branch] gh/XuehaiPan/387/head -> origin/gh/XuehaiPan/387/head 2025-09-07T07:38:46.3059702Z * [new branch] gh/XuehaiPan/387/orig -> origin/gh/XuehaiPan/387/orig 2025-09-07T07:38:46.3060765Z * [new branch] gh/ZainRizvi/1/base -> origin/gh/ZainRizvi/1/base 2025-09-07T07:38:46.3061364Z * [new branch] gh/ZainRizvi/1/head -> origin/gh/ZainRizvi/1/head 2025-09-07T07:38:46.3062117Z * [new branch] gh/ZainRizvi/2/base -> origin/gh/ZainRizvi/2/base 2025-09-07T07:38:46.3062526Z * [new branch] gh/ZainRizvi/2/head -> origin/gh/ZainRizvi/2/head 2025-09-07T07:38:46.3063282Z * [new branch] gh/ZainRizvi/3/base -> origin/gh/ZainRizvi/3/base 
2025-09-07T07:38:46.3063684Z * [new branch] gh/ZainRizvi/3/head -> origin/gh/ZainRizvi/3/head 2025-09-07T07:38:46.3064543Z * [new branch] gh/ZainRizvi/4/base -> origin/gh/ZainRizvi/4/base 2025-09-07T07:38:46.3064987Z * [new branch] gh/ZainRizvi/4/head -> origin/gh/ZainRizvi/4/head 2025-09-07T07:38:46.3066113Z * [new branch] gh/ZainRizvi/5/base -> origin/gh/ZainRizvi/5/base 2025-09-07T07:38:46.3066486Z * [new branch] gh/ZainRizvi/5/head -> origin/gh/ZainRizvi/5/head 2025-09-07T07:38:46.3067277Z * [new branch] gh/ZainRizvi/6/base -> origin/gh/ZainRizvi/6/base 2025-09-07T07:38:46.3067712Z * [new branch] gh/ZainRizvi/6/head -> origin/gh/ZainRizvi/6/head 2025-09-07T07:38:46.3068246Z * [new branch] gh/ZainRizvi/6/orig -> origin/gh/ZainRizvi/6/orig 2025-09-07T07:38:46.3069079Z * [new branch] gh/ZainRizvi/7/base -> origin/gh/ZainRizvi/7/base 2025-09-07T07:38:46.3069530Z * [new branch] gh/ZainRizvi/7/head -> origin/gh/ZainRizvi/7/head 2025-09-07T07:38:46.3069988Z * [new branch] gh/ZainRizvi/7/orig -> origin/gh/ZainRizvi/7/orig 2025-09-07T07:38:46.3071041Z * [new branch] gh/ZainRizvi/8/base -> origin/gh/ZainRizvi/8/base 2025-09-07T07:38:46.3071503Z * [new branch] gh/ZainRizvi/8/head -> origin/gh/ZainRizvi/8/head 2025-09-07T07:38:46.3072328Z * [new branch] gh/ZainRizvi/9/base -> origin/gh/ZainRizvi/9/base 2025-09-07T07:38:46.3072783Z * [new branch] gh/ZainRizvi/9/head -> origin/gh/ZainRizvi/9/head 2025-09-07T07:38:46.3073337Z * [new branch] gh/ZainRizvi/9/orig -> origin/gh/ZainRizvi/9/orig 2025-09-07T07:38:46.3074321Z * [new branch] gh/ZhiweiYan-96/39/base -> origin/gh/ZhiweiYan-96/39/base 2025-09-07T07:38:46.3074787Z * [new branch] gh/ZhiweiYan-96/39/head -> origin/gh/ZhiweiYan-96/39/head 2025-09-07T07:38:46.3075397Z * [new branch] gh/ZhiweiYan-96/39/orig -> origin/gh/ZhiweiYan-96/39/orig 2025-09-07T07:38:46.3076262Z * [new branch] gh/ZhiweiYan-96/44/base -> origin/gh/ZhiweiYan-96/44/base 2025-09-07T07:38:46.3076734Z * [new branch] gh/ZhiweiYan-96/44/head -> origin/gh/ZhiweiYan-96/44/head 2025-09-07T07:38:46.3077573Z * [new branch] gh/ZhiweiYan-96/45/base -> origin/gh/ZhiweiYan-96/45/base 2025-09-07T07:38:46.3077920Z * [new branch] gh/ZhiweiYan-96/45/head -> origin/gh/ZhiweiYan-96/45/head 2025-09-07T07:38:46.3078784Z * [new branch] gh/ZhiweiYan-96/49/base -> origin/gh/ZhiweiYan-96/49/base 2025-09-07T07:38:46.3079212Z * [new branch] gh/ZhiweiYan-96/49/head -> origin/gh/ZhiweiYan-96/49/head 2025-09-07T07:38:46.3080244Z * [new branch] gh/ZhiweiYan-96/62/base -> origin/gh/ZhiweiYan-96/62/base 2025-09-07T07:38:46.3080706Z * [new branch] gh/ZhiweiYan-96/62/head -> origin/gh/ZhiweiYan-96/62/head 2025-09-07T07:38:46.3081639Z * [new branch] gh/ZhiweiYan-96/64/base -> origin/gh/ZhiweiYan-96/64/base 2025-09-07T07:38:46.3082057Z * [new branch] gh/ZhiweiYan-96/64/head -> origin/gh/ZhiweiYan-96/64/head 2025-09-07T07:38:46.3082650Z * [new branch] gh/ZhiweiYan-96/64/orig -> origin/gh/ZhiweiYan-96/64/orig 2025-09-07T07:38:46.3083455Z * [new branch] gh/ZhiweiYan-96/65/base -> origin/gh/ZhiweiYan-96/65/base 2025-09-07T07:38:46.3083912Z * [new branch] gh/ZhiweiYan-96/65/head -> origin/gh/ZhiweiYan-96/65/head 2025-09-07T07:38:46.3084448Z * [new branch] gh/ZhiweiYan-96/65/orig -> origin/gh/ZhiweiYan-96/65/orig 2025-09-07T07:38:46.3085260Z * [new branch] gh/ZhiweiYan-96/66/base -> origin/gh/ZhiweiYan-96/66/base 2025-09-07T07:38:46.3085706Z * [new branch] gh/ZhiweiYan-96/66/head -> origin/gh/ZhiweiYan-96/66/head 2025-09-07T07:38:46.3086459Z * [new branch] gh/ZhiweiYan-96/67/base -> origin/gh/ZhiweiYan-96/67/base 
2025-09-07T07:38:46.3086855Z * [new branch] gh/ZhiweiYan-96/67/head -> origin/gh/ZhiweiYan-96/67/head 2025-09-07T07:38:46.3087626Z * [new branch] gh/ZhiweiYan-96/68/base -> origin/gh/ZhiweiYan-96/68/base 2025-09-07T07:38:46.3088020Z * [new branch] gh/ZhiweiYan-96/68/head -> origin/gh/ZhiweiYan-96/68/head 2025-09-07T07:38:46.3088572Z * [new branch] gh/ZhiweiYan-96/68/orig -> origin/gh/ZhiweiYan-96/68/orig 2025-09-07T07:38:46.3089811Z * [new branch] gh/aakhundov/1/base -> origin/gh/aakhundov/1/base 2025-09-07T07:38:46.3090330Z * [new branch] gh/aakhundov/1/head -> origin/gh/aakhundov/1/head 2025-09-07T07:38:46.3091082Z * [new branch] gh/aakhundov/2/base -> origin/gh/aakhundov/2/base 2025-09-07T07:38:46.3091552Z * [new branch] gh/aakhundov/2/head -> origin/gh/aakhundov/2/head 2025-09-07T07:38:46.3092487Z * [new branch] gh/aditew01/openblas -> origin/gh/aditew01/openblas 2025-09-07T07:38:46.3092816Z * [new branch] gh/aditew01/sbgemm -> origin/gh/aditew01/sbgemm 2025-09-07T07:38:46.3093379Z * [new branch] gh/aditew01/vecbf16 -> origin/gh/aditew01/vecbf16 2025-09-07T07:38:46.3094371Z * [new branch] gh/alexbrauckmann/paddedtensor_faketensor_init -> origin/gh/alexbrauckmann/paddedtensor_faketensor_init 2025-09-07T07:38:46.3095090Z * [new branch] gh/alexsamardzic/9/base -> origin/gh/alexsamardzic/9/base 2025-09-07T07:38:46.3095602Z * [new branch] gh/alexsamardzic/9/head -> origin/gh/alexsamardzic/9/head 2025-09-07T07:38:46.3096214Z * [new branch] gh/alexsamardzic/9/orig -> origin/gh/alexsamardzic/9/orig 2025-09-07T07:38:46.3097215Z * [new branch] gh/amjames/18/base -> origin/gh/amjames/18/base 2025-09-07T07:38:46.3097681Z * [new branch] gh/amjames/18/head -> origin/gh/amjames/18/head 2025-09-07T07:38:46.3098193Z * [new branch] gh/amjames/18/orig -> origin/gh/amjames/18/orig 2025-09-07T07:38:46.3099441Z * [new branch] gh/andrewor14/35/base -> origin/gh/andrewor14/35/base 2025-09-07T07:38:46.3100080Z * [new branch] gh/andrewor14/35/head -> origin/gh/andrewor14/35/head 2025-09-07T07:38:46.3100651Z * [new branch] gh/andrewor14/35/orig -> origin/gh/andrewor14/35/orig 2025-09-07T07:38:46.3101574Z * [new branch] gh/andrewor14/50/base -> origin/gh/andrewor14/50/base 2025-09-07T07:38:46.3102530Z * [new branch] gh/andrewor14/50/head -> origin/gh/andrewor14/50/head 2025-09-07T07:38:46.3103033Z * [new branch] gh/andrewor14/50/orig -> origin/gh/andrewor14/50/orig 2025-09-07T07:38:46.3103920Z * [new branch] gh/andrewor14/51/base -> origin/gh/andrewor14/51/base 2025-09-07T07:38:46.3104422Z * [new branch] gh/andrewor14/51/orig -> origin/gh/andrewor14/51/orig 2025-09-07T07:38:46.3105593Z * [new branch] gh/andyanwang/1/base -> origin/gh/andyanwang/1/base 2025-09-07T07:38:46.3105997Z * [new branch] gh/andyanwang/1/head -> origin/gh/andyanwang/1/head 2025-09-07T07:38:46.3106533Z * [new branch] gh/andyanwang/1/orig -> origin/gh/andyanwang/1/orig 2025-09-07T07:38:46.3107479Z * [new branch] gh/andyanwang/13/base -> origin/gh/andyanwang/13/base 2025-09-07T07:38:46.3107957Z * [new branch] gh/andyanwang/13/head -> origin/gh/andyanwang/13/head 2025-09-07T07:38:46.3109060Z * [new branch] gh/andyanwang/13/orig -> origin/gh/andyanwang/13/orig 2025-09-07T07:38:46.3109826Z * [new branch] gh/andyanwang/2/base -> origin/gh/andyanwang/2/base 2025-09-07T07:38:46.3110304Z * [new branch] gh/andyanwang/2/head -> origin/gh/andyanwang/2/head 2025-09-07T07:38:46.3111074Z * [new branch] gh/andyanwang/2/orig -> origin/gh/andyanwang/2/orig 2025-09-07T07:38:46.3111922Z * [new branch] gh/andyanwang/28/base -> origin/gh/andyanwang/28/base 
2025-09-07T07:38:46.3112440Z * [new branch] gh/andyanwang/28/head -> origin/gh/andyanwang/28/head 2025-09-07T07:38:46.3112969Z * [new branch] gh/andyanwang/28/orig -> origin/gh/andyanwang/28/orig 2025-09-07T07:38:46.3113667Z * [new branch] gh/andyanwang/3/base -> origin/gh/andyanwang/3/base 2025-09-07T07:38:46.3114183Z * [new branch] gh/andyanwang/3/head -> origin/gh/andyanwang/3/head 2025-09-07T07:38:46.3114752Z * [new branch] gh/andyanwang/3/orig -> origin/gh/andyanwang/3/orig 2025-09-07T07:38:46.3115633Z * [new branch] gh/andyanwang/30/base -> origin/gh/andyanwang/30/base 2025-09-07T07:38:46.3116289Z * [new branch] gh/andyanwang/30/orig -> origin/gh/andyanwang/30/orig 2025-09-07T07:38:46.3117159Z * [new branch] gh/andyanwang/31/base -> origin/gh/andyanwang/31/base 2025-09-07T07:38:46.3117790Z * [new branch] gh/andyanwang/31/orig -> origin/gh/andyanwang/31/orig 2025-09-07T07:38:46.3119002Z * [new branch] gh/andyanwang/32/base -> origin/gh/andyanwang/32/base 2025-09-07T07:38:46.3119499Z * [new branch] gh/andyanwang/32/head -> origin/gh/andyanwang/32/head 2025-09-07T07:38:46.3120127Z * [new branch] gh/andyanwang/32/orig -> origin/gh/andyanwang/32/orig 2025-09-07T07:38:46.3120998Z * [new branch] gh/andyanwang/39/base -> origin/gh/andyanwang/39/base 2025-09-07T07:38:46.3121651Z * [new branch] gh/andyanwang/39/head -> origin/gh/andyanwang/39/head 2025-09-07T07:38:46.3122209Z * [new branch] gh/andyanwang/39/orig -> origin/gh/andyanwang/39/orig 2025-09-07T07:38:46.3123093Z * [new branch] gh/andyanwang/4/base -> origin/gh/andyanwang/4/base 2025-09-07T07:38:46.3123553Z * [new branch] gh/andyanwang/4/head -> origin/gh/andyanwang/4/head 2025-09-07T07:38:46.3124152Z * [new branch] gh/andyanwang/4/orig -> origin/gh/andyanwang/4/orig 2025-09-07T07:38:46.3125161Z * [new branch] gh/angelayi/107/base -> origin/gh/angelayi/107/base 2025-09-07T07:38:46.3125639Z * [new branch] gh/angelayi/107/head -> origin/gh/angelayi/107/head 2025-09-07T07:38:46.3126446Z * [new branch] gh/angelayi/111/base -> origin/gh/angelayi/111/base 2025-09-07T07:38:46.3126907Z * [new branch] gh/angelayi/111/head -> origin/gh/angelayi/111/head 2025-09-07T07:38:46.3127428Z * [new branch] gh/angelayi/111/orig -> origin/gh/angelayi/111/orig 2025-09-07T07:38:46.3128265Z * [new branch] gh/angelayi/112/base -> origin/gh/angelayi/112/base 2025-09-07T07:38:46.3128753Z * [new branch] gh/angelayi/112/head -> origin/gh/angelayi/112/head 2025-09-07T07:38:46.3129304Z * [new branch] gh/angelayi/112/orig -> origin/gh/angelayi/112/orig 2025-09-07T07:38:46.3130236Z * [new branch] gh/angelayi/113/base -> origin/gh/angelayi/113/base 2025-09-07T07:38:46.3130778Z * [new branch] gh/angelayi/113/head -> origin/gh/angelayi/113/head 2025-09-07T07:38:46.3131285Z * [new branch] gh/angelayi/113/orig -> origin/gh/angelayi/113/orig 2025-09-07T07:38:46.3132130Z * [new branch] gh/angelayi/114/base -> origin/gh/angelayi/114/base 2025-09-07T07:38:46.3132552Z * [new branch] gh/angelayi/114/head -> origin/gh/angelayi/114/head 2025-09-07T07:38:46.3133062Z * [new branch] gh/angelayi/114/orig -> origin/gh/angelayi/114/orig 2025-09-07T07:38:46.3133966Z * [new branch] gh/angelayi/115/base -> origin/gh/angelayi/115/base 2025-09-07T07:38:46.3134491Z * [new branch] gh/angelayi/115/head -> origin/gh/angelayi/115/head 2025-09-07T07:38:46.3135028Z * [new branch] gh/angelayi/115/orig -> origin/gh/angelayi/115/orig 2025-09-07T07:38:46.3136167Z * [new branch] gh/anijain2305/753/base -> origin/gh/anijain2305/753/base 2025-09-07T07:38:46.3136618Z * [new branch] gh/anijain2305/753/head 
-> origin/gh/anijain2305/753/head 2025-09-07T07:38:46.3137196Z * [new branch] gh/anijain2305/753/orig -> origin/gh/anijain2305/753/orig 2025-09-07T07:38:46.3138095Z * [new branch] gh/anijain2305/766/base -> origin/gh/anijain2305/766/base 2025-09-07T07:38:46.3138484Z * [new branch] gh/anijain2305/766/head -> origin/gh/anijain2305/766/head 2025-09-07T07:38:46.3138990Z * [new branch] gh/anijain2305/766/orig -> origin/gh/anijain2305/766/orig 2025-09-07T07:38:46.3139880Z * [new branch] gh/anijain2305/790/base -> origin/gh/anijain2305/790/base 2025-09-07T07:38:46.3140438Z * [new branch] gh/anijain2305/790/head -> origin/gh/anijain2305/790/head 2025-09-07T07:38:46.3140996Z * [new branch] gh/anijain2305/790/orig -> origin/gh/anijain2305/790/orig 2025-09-07T07:38:46.3141799Z * [new branch] gh/anijain2305/792/base -> origin/gh/anijain2305/792/base 2025-09-07T07:38:46.3142227Z * [new branch] gh/anijain2305/792/head -> origin/gh/anijain2305/792/head 2025-09-07T07:38:46.3142743Z * [new branch] gh/anijain2305/792/orig -> origin/gh/anijain2305/792/orig 2025-09-07T07:38:46.3143586Z * [new branch] gh/anijain2305/803/base -> origin/gh/anijain2305/803/base 2025-09-07T07:38:46.3144062Z * [new branch] gh/anijain2305/803/head -> origin/gh/anijain2305/803/head 2025-09-07T07:38:46.3144575Z * [new branch] gh/anijain2305/803/orig -> origin/gh/anijain2305/803/orig 2025-09-07T07:38:46.3145421Z * [new branch] gh/anijain2305/804/base -> origin/gh/anijain2305/804/base 2025-09-07T07:38:46.3145803Z * [new branch] gh/anijain2305/804/head -> origin/gh/anijain2305/804/head 2025-09-07T07:38:46.3146342Z * [new branch] gh/anijain2305/804/orig -> origin/gh/anijain2305/804/orig 2025-09-07T07:38:46.3147198Z * [new branch] gh/anijain2305/805/base -> origin/gh/anijain2305/805/base 2025-09-07T07:38:46.3147649Z * [new branch] gh/anijain2305/805/head -> origin/gh/anijain2305/805/head 2025-09-07T07:38:46.3148192Z * [new branch] gh/anijain2305/805/orig -> origin/gh/anijain2305/805/orig 2025-09-07T07:38:46.3149119Z * [new branch] gh/anijain2305/810/base -> origin/gh/anijain2305/810/base 2025-09-07T07:38:46.3149734Z * [new branch] gh/anijain2305/810/head -> origin/gh/anijain2305/810/head 2025-09-07T07:38:46.3150288Z * [new branch] gh/anijain2305/810/orig -> origin/gh/anijain2305/810/orig 2025-09-07T07:38:46.3151154Z * [new branch] gh/anijain2305/812/base -> origin/gh/anijain2305/812/base 2025-09-07T07:38:46.3151635Z * [new branch] gh/anijain2305/812/head -> origin/gh/anijain2305/812/head 2025-09-07T07:38:46.3152145Z * [new branch] gh/anijain2305/812/orig -> origin/gh/anijain2305/812/orig 2025-09-07T07:38:46.3153008Z * [new branch] gh/anijain2305/838/base -> origin/gh/anijain2305/838/base 2025-09-07T07:38:46.3153501Z * [new branch] gh/anijain2305/838/head -> origin/gh/anijain2305/838/head 2025-09-07T07:38:46.3154016Z * [new branch] gh/anijain2305/838/orig -> origin/gh/anijain2305/838/orig 2025-09-07T07:38:46.3154829Z * [new branch] gh/anijain2305/839/base -> origin/gh/anijain2305/839/base 2025-09-07T07:38:46.3155291Z * [new branch] gh/anijain2305/839/head -> origin/gh/anijain2305/839/head 2025-09-07T07:38:46.3155820Z * [new branch] gh/anijain2305/839/orig -> origin/gh/anijain2305/839/orig 2025-09-07T07:38:46.3156654Z * [new branch] gh/anijain2305/843/base -> origin/gh/anijain2305/843/base 2025-09-07T07:38:46.3157097Z * [new branch] gh/anijain2305/843/head -> origin/gh/anijain2305/843/head 2025-09-07T07:38:46.3157600Z * [new branch] gh/anijain2305/843/orig -> origin/gh/anijain2305/843/orig 2025-09-07T07:38:46.3158455Z * [new branch] 
gh/anijain2305/844/base -> origin/gh/anijain2305/844/base 2025-09-07T07:38:46.3159074Z * [new branch] gh/anijain2305/844/head -> origin/gh/anijain2305/844/head 2025-09-07T07:38:46.3159597Z * [new branch] gh/anijain2305/844/orig -> origin/gh/anijain2305/844/orig 2025-09-07T07:38:46.3160470Z * [new branch] gh/anijain2305/846/base -> origin/gh/anijain2305/846/base 2025-09-07T07:38:46.3161405Z * [new branch] gh/anijain2305/846/head -> origin/gh/anijain2305/846/head 2025-09-07T07:38:46.3161816Z * [new branch] gh/anijain2305/846/orig -> origin/gh/anijain2305/846/orig 2025-09-07T07:38:46.3162752Z * [new branch] gh/anijain2305/848/base -> origin/gh/anijain2305/848/base 2025-09-07T07:38:46.3163353Z * [new branch] gh/anijain2305/848/head -> origin/gh/anijain2305/848/head 2025-09-07T07:38:46.3163869Z * [new branch] gh/anijain2305/848/orig -> origin/gh/anijain2305/848/orig 2025-09-07T07:38:46.3164707Z * [new branch] gh/anijain2305/849/base -> origin/gh/anijain2305/849/base 2025-09-07T07:38:46.3165161Z * [new branch] gh/anijain2305/849/head -> origin/gh/anijain2305/849/head 2025-09-07T07:38:46.3165698Z * [new branch] gh/anijain2305/849/orig -> origin/gh/anijain2305/849/orig 2025-09-07T07:38:46.3166741Z * [new branch] gh/anijain2305/850/base -> origin/gh/anijain2305/850/base 2025-09-07T07:38:46.3167185Z * [new branch] gh/anijain2305/850/head -> origin/gh/anijain2305/850/head 2025-09-07T07:38:46.3167718Z * [new branch] gh/anijain2305/850/orig -> origin/gh/anijain2305/850/orig 2025-09-07T07:38:46.3168586Z * [new branch] gh/anijain2305/851/base -> origin/gh/anijain2305/851/base 2025-09-07T07:38:46.3169235Z * [new branch] gh/anijain2305/851/head -> origin/gh/anijain2305/851/head 2025-09-07T07:38:46.3169753Z * [new branch] gh/anijain2305/851/orig -> origin/gh/anijain2305/851/orig 2025-09-07T07:38:46.3170672Z * [new branch] gh/anijain2305/852/base -> origin/gh/anijain2305/852/base 2025-09-07T07:38:46.3171140Z * [new branch] gh/anijain2305/852/head -> origin/gh/anijain2305/852/head 2025-09-07T07:38:46.3171684Z * [new branch] gh/anijain2305/852/orig -> origin/gh/anijain2305/852/orig 2025-09-07T07:38:46.3172468Z * [new branch] gh/anijain2305/853/base -> origin/gh/anijain2305/853/base 2025-09-07T07:38:46.3172889Z * [new branch] gh/anijain2305/853/head -> origin/gh/anijain2305/853/head 2025-09-07T07:38:46.3173389Z * [new branch] gh/anijain2305/853/orig -> origin/gh/anijain2305/853/orig 2025-09-07T07:38:46.3174222Z * [new branch] gh/anijain2305/854/base -> origin/gh/anijain2305/854/base 2025-09-07T07:38:46.3174738Z * [new branch] gh/anijain2305/854/head -> origin/gh/anijain2305/854/head 2025-09-07T07:38:46.3175270Z * [new branch] gh/anijain2305/854/orig -> origin/gh/anijain2305/854/orig 2025-09-07T07:38:46.3176103Z * [new branch] gh/anijain2305/855/base -> origin/gh/anijain2305/855/base 2025-09-07T07:38:46.3176563Z * [new branch] gh/anijain2305/855/head -> origin/gh/anijain2305/855/head 2025-09-07T07:38:46.3177098Z * [new branch] gh/anijain2305/855/orig -> origin/gh/anijain2305/855/orig 2025-09-07T07:38:46.3178149Z * [new branch] gh/anijain2305/856/base -> origin/gh/anijain2305/856/base 2025-09-07T07:38:46.3178608Z * [new branch] gh/anijain2305/856/head -> origin/gh/anijain2305/856/head 2025-09-07T07:38:46.3179127Z * [new branch] gh/anijain2305/856/orig -> origin/gh/anijain2305/856/orig 2025-09-07T07:38:46.3179973Z * [new branch] gh/anijain2305/857/base -> origin/gh/anijain2305/857/base 2025-09-07T07:38:46.3180371Z * [new branch] gh/anijain2305/857/head -> origin/gh/anijain2305/857/head 2025-09-07T07:38:46.3180896Z 
* [new branch] gh/anijain2305/857/orig -> origin/gh/anijain2305/857/orig 2025-09-07T07:38:46.3181811Z * [new branch] gh/anijain2305/858/base -> origin/gh/anijain2305/858/base 2025-09-07T07:38:46.3182245Z * [new branch] gh/anijain2305/858/head -> origin/gh/anijain2305/858/head 2025-09-07T07:38:46.3182862Z * [new branch] gh/anijain2305/858/orig -> origin/gh/anijain2305/858/orig 2025-09-07T07:38:46.3183684Z * [new branch] gh/anijain2305/859/base -> origin/gh/anijain2305/859/base 2025-09-07T07:38:46.3184116Z * [new branch] gh/anijain2305/859/head -> origin/gh/anijain2305/859/head 2025-09-07T07:38:46.3184682Z * [new branch] gh/anijain2305/859/orig -> origin/gh/anijain2305/859/orig 2025-09-07T07:38:46.3185503Z * [new branch] gh/anijain2305/860/base -> origin/gh/anijain2305/860/base 2025-09-07T07:38:46.3185982Z * [new branch] gh/anijain2305/860/head -> origin/gh/anijain2305/860/head 2025-09-07T07:38:46.3186518Z * [new branch] gh/anijain2305/860/orig -> origin/gh/anijain2305/860/orig 2025-09-07T07:38:46.3187531Z * [new branch] gh/anijain2305/861/base -> origin/gh/anijain2305/861/base 2025-09-07T07:38:46.3187919Z * [new branch] gh/anijain2305/861/head -> origin/gh/anijain2305/861/head 2025-09-07T07:38:46.3188453Z * [new branch] gh/anijain2305/861/orig -> origin/gh/anijain2305/861/orig 2025-09-07T07:38:46.3189315Z * [new branch] gh/anijain2305/862/base -> origin/gh/anijain2305/862/base 2025-09-07T07:38:46.3189835Z * [new branch] gh/anijain2305/862/head -> origin/gh/anijain2305/862/head 2025-09-07T07:38:46.3190332Z * [new branch] gh/anijain2305/862/orig -> origin/gh/anijain2305/862/orig 2025-09-07T07:38:46.3191202Z * [new branch] gh/anijain2305/863/base -> origin/gh/anijain2305/863/base 2025-09-07T07:38:46.3191732Z * [new branch] gh/anijain2305/863/head -> origin/gh/anijain2305/863/head 2025-09-07T07:38:46.3192265Z * [new branch] gh/anijain2305/863/orig -> origin/gh/anijain2305/863/orig 2025-09-07T07:38:46.3193411Z * [new branch] gh/anijain2305/864/base -> origin/gh/anijain2305/864/base 2025-09-07T07:38:46.3193735Z * [new branch] gh/anijain2305/864/head -> origin/gh/anijain2305/864/head 2025-09-07T07:38:46.3194173Z * [new branch] gh/anijain2305/864/orig -> origin/gh/anijain2305/864/orig 2025-09-07T07:38:46.3195014Z * [new branch] gh/anijain2305/865/base -> origin/gh/anijain2305/865/base 2025-09-07T07:38:46.3195456Z * [new branch] gh/anijain2305/865/head -> origin/gh/anijain2305/865/head 2025-09-07T07:38:46.3195973Z * [new branch] gh/anijain2305/865/orig -> origin/gh/anijain2305/865/orig 2025-09-07T07:38:46.3196936Z * [new branch] gh/anijain2305/866/base -> origin/gh/anijain2305/866/base 2025-09-07T07:38:46.3197478Z * [new branch] gh/anijain2305/866/head -> origin/gh/anijain2305/866/head 2025-09-07T07:38:46.3198302Z * [new branch] gh/anijain2305/866/orig -> origin/gh/anijain2305/866/orig 2025-09-07T07:38:46.3199151Z * [new branch] gh/anjali411/216/base -> origin/gh/anjali411/216/base 2025-09-07T07:38:46.3199596Z * [new branch] gh/anjali411/216/head -> origin/gh/anjali411/216/head 2025-09-07T07:38:46.3200231Z * [new branch] gh/anjali411/216/orig -> origin/gh/anjali411/216/orig 2025-09-07T07:38:46.3201159Z * [new branch] gh/ankitageorge/13/base -> origin/gh/ankitageorge/13/base 2025-09-07T07:38:46.3201631Z * [new branch] gh/ankitageorge/13/head -> origin/gh/ankitageorge/13/head 2025-09-07T07:38:46.3202201Z * [new branch] gh/ankitageorge/13/orig -> origin/gh/ankitageorge/13/orig 2025-09-07T07:38:46.3203178Z * [new branch] gh/ankitageorge/14/base -> origin/gh/ankitageorge/14/base 
2025-09-07T07:38:46.3203603Z * [new branch] gh/ankitageorge/14/head -> origin/gh/ankitageorge/14/head 2025-09-07T07:38:46.3204369Z * [new branch] gh/ankitageorge/14/orig -> origin/gh/ankitageorge/14/orig 2025-09-07T07:38:46.3205220Z * [new branch] gh/ankitageorge/15/base -> origin/gh/ankitageorge/15/base 2025-09-07T07:38:46.3205663Z * [new branch] gh/ankitageorge/15/head -> origin/gh/ankitageorge/15/head 2025-09-07T07:38:46.3206244Z * [new branch] gh/ankitageorge/15/orig -> origin/gh/ankitageorge/15/orig 2025-09-07T07:38:46.3207450Z * [new branch] gh/ankitageorge/16/base -> origin/gh/ankitageorge/16/base 2025-09-07T07:38:46.3207893Z * [new branch] gh/ankitageorge/16/head -> origin/gh/ankitageorge/16/head 2025-09-07T07:38:46.3208510Z * [new branch] gh/ankitageorge/16/orig -> origin/gh/ankitageorge/16/orig 2025-09-07T07:38:46.3209481Z * [new branch] gh/ankitageorge/17/base -> origin/gh/ankitageorge/17/base 2025-09-07T07:38:46.3209904Z * [new branch] gh/ankitageorge/17/head -> origin/gh/ankitageorge/17/head 2025-09-07T07:38:46.3210452Z * [new branch] gh/ankitageorge/17/orig -> origin/gh/ankitageorge/17/orig 2025-09-07T07:38:46.3211396Z * [new branch] gh/ankitageorge/21/base -> origin/gh/ankitageorge/21/base 2025-09-07T07:38:46.3211781Z * [new branch] gh/ankitageorge/21/head -> origin/gh/ankitageorge/21/head 2025-09-07T07:38:46.3212319Z * [new branch] gh/ankitageorge/21/orig -> origin/gh/ankitageorge/21/orig 2025-09-07T07:38:46.3213478Z * [new branch] gh/anshul-si/1/base -> origin/gh/anshul-si/1/base 2025-09-07T07:38:46.3213975Z * [new branch] gh/anshul-si/1/head -> origin/gh/anshul-si/1/head 2025-09-07T07:38:46.3214796Z * [new branch] gh/anshul-si/15/base -> origin/gh/anshul-si/15/base 2025-09-07T07:38:46.3215255Z * [new branch] gh/anshul-si/15/head -> origin/gh/anshul-si/15/head 2025-09-07T07:38:46.3215787Z * [new branch] gh/anshul-si/15/orig -> origin/gh/anshul-si/15/orig 2025-09-07T07:38:46.3216727Z * [new branch] gh/anshul-si/16/base -> origin/gh/anshul-si/16/base 2025-09-07T07:38:46.3217273Z * [new branch] gh/anshul-si/16/head -> origin/gh/anshul-si/16/head 2025-09-07T07:38:46.3217829Z * [new branch] gh/anshul-si/16/orig -> origin/gh/anshul-si/16/orig 2025-09-07T07:38:46.3218715Z * [new branch] gh/anshul-si/17/base -> origin/gh/anshul-si/17/base 2025-09-07T07:38:46.3219284Z * [new branch] gh/anshul-si/17/head -> origin/gh/anshul-si/17/head 2025-09-07T07:38:46.3220032Z * [new branch] gh/anshul-si/17/orig -> origin/gh/anshul-si/17/orig 2025-09-07T07:38:46.3220776Z * [new branch] gh/anshul-si/18/base -> origin/gh/anshul-si/18/base 2025-09-07T07:38:46.3221356Z * [new branch] gh/anshul-si/18/head -> origin/gh/anshul-si/18/head 2025-09-07T07:38:46.3221946Z * [new branch] gh/anshul-si/18/orig -> origin/gh/anshul-si/18/orig 2025-09-07T07:38:46.3222787Z * [new branch] gh/anshul-si/19/base -> origin/gh/anshul-si/19/base 2025-09-07T07:38:46.3223317Z * [new branch] gh/anshul-si/19/head -> origin/gh/anshul-si/19/head 2025-09-07T07:38:46.3223912Z * [new branch] gh/anshul-si/19/orig -> origin/gh/anshul-si/19/orig 2025-09-07T07:38:46.3224557Z * [new branch] gh/anshul-si/2/base -> origin/gh/anshul-si/2/base 2025-09-07T07:38:46.3225044Z * [new branch] gh/anshul-si/2/head -> origin/gh/anshul-si/2/head 2025-09-07T07:38:46.3226020Z * [new branch] gh/anshul-si/20/base -> origin/gh/anshul-si/20/base 2025-09-07T07:38:46.3226590Z * [new branch] gh/anshul-si/20/head -> origin/gh/anshul-si/20/head 2025-09-07T07:38:46.3227091Z * [new branch] gh/anshul-si/20/orig -> origin/gh/anshul-si/20/orig 
2025-09-07T07:38:46.3227877Z * [new branch] gh/anshul-si/21/base -> origin/gh/anshul-si/21/base 2025-09-07T07:38:46.3228350Z * [new branch] gh/anshul-si/21/head -> origin/gh/anshul-si/21/head 2025-09-07T07:38:46.3228875Z * [new branch] gh/anshul-si/21/orig -> origin/gh/anshul-si/21/orig 2025-09-07T07:38:46.3229695Z * [new branch] gh/anshul-si/22/base -> origin/gh/anshul-si/22/base 2025-09-07T07:38:46.3230159Z * [new branch] gh/anshul-si/22/head -> origin/gh/anshul-si/22/head 2025-09-07T07:38:46.3230899Z * [new branch] gh/anshul-si/22/orig -> origin/gh/anshul-si/22/orig 2025-09-07T07:38:46.3231396Z * [new branch] gh/anshul-si/23/base -> origin/gh/anshul-si/23/base 2025-09-07T07:38:46.3231987Z * [new branch] gh/anshul-si/23/head -> origin/gh/anshul-si/23/head 2025-09-07T07:38:46.3232522Z * [new branch] gh/anshul-si/23/orig -> origin/gh/anshul-si/23/orig 2025-09-07T07:38:46.3233389Z * [new branch] gh/anshul-si/24/base -> origin/gh/anshul-si/24/base 2025-09-07T07:38:46.3233921Z * [new branch] gh/anshul-si/24/head -> origin/gh/anshul-si/24/head 2025-09-07T07:38:46.3234415Z * [new branch] gh/anshul-si/24/orig -> origin/gh/anshul-si/24/orig 2025-09-07T07:38:46.3235360Z * [new branch] gh/anshul-si/25/base -> origin/gh/anshul-si/25/base 2025-09-07T07:38:46.3235906Z * [new branch] gh/anshul-si/25/head -> origin/gh/anshul-si/25/head 2025-09-07T07:38:46.3236426Z * [new branch] gh/anshul-si/25/orig -> origin/gh/anshul-si/25/orig 2025-09-07T07:38:46.3237241Z * [new branch] gh/anshul-si/26/base -> origin/gh/anshul-si/26/base 2025-09-07T07:38:46.3237700Z * [new branch] gh/anshul-si/26/head -> origin/gh/anshul-si/26/head 2025-09-07T07:38:46.3238223Z * [new branch] gh/anshul-si/26/orig -> origin/gh/anshul-si/26/orig 2025-09-07T07:38:46.3239092Z * [new branch] gh/anshul-si/27/base -> origin/gh/anshul-si/27/base 2025-09-07T07:38:46.3239576Z * [new branch] gh/anshul-si/27/head -> origin/gh/anshul-si/27/head 2025-09-07T07:38:46.3240078Z * [new branch] gh/anshul-si/27/orig -> origin/gh/anshul-si/27/orig 2025-09-07T07:38:46.3240822Z * [new branch] gh/anshul-si/28/base -> origin/gh/anshul-si/28/base 2025-09-07T07:38:46.3241276Z * [new branch] gh/anshul-si/28/head -> origin/gh/anshul-si/28/head 2025-09-07T07:38:46.3241860Z * [new branch] gh/anshul-si/28/orig -> origin/gh/anshul-si/28/orig 2025-09-07T07:38:46.3242527Z * [new branch] gh/anshul-si/29/base -> origin/gh/anshul-si/29/base 2025-09-07T07:38:46.3243207Z * [new branch] gh/anshul-si/29/head -> origin/gh/anshul-si/29/head 2025-09-07T07:38:46.3243672Z * [new branch] gh/anshul-si/29/orig -> origin/gh/anshul-si/29/orig 2025-09-07T07:38:46.3244410Z * [new branch] gh/anshul-si/3/base -> origin/gh/anshul-si/3/base 2025-09-07T07:38:46.3245055Z * [new branch] gh/anshul-si/3/head -> origin/gh/anshul-si/3/head 2025-09-07T07:38:46.3245679Z * [new branch] gh/anshul-si/4/base -> origin/gh/anshul-si/4/base 2025-09-07T07:38:46.3246081Z * [new branch] gh/anshul-si/4/head -> origin/gh/anshul-si/4/head 2025-09-07T07:38:46.3247100Z * [new branch] gh/anshul-si/5/base -> origin/gh/anshul-si/5/base 2025-09-07T07:38:46.3247688Z * [new branch] gh/anshul-si/5/head -> origin/gh/anshul-si/5/head 2025-09-07T07:38:46.3248771Z * [new branch] gh/aorenste/132/base -> origin/gh/aorenste/132/base 2025-09-07T07:38:46.3249374Z * [new branch] gh/aorenste/132/head -> origin/gh/aorenste/132/head 2025-09-07T07:38:46.3250415Z * [new branch] gh/bdhirsh/650/base -> origin/gh/bdhirsh/650/base 2025-09-07T07:38:46.3251173Z * [new branch] gh/bdhirsh/650/head -> origin/gh/bdhirsh/650/head 
2025-09-07T07:38:46.3251599Z * [new branch] gh/bdhirsh/650/orig -> origin/gh/bdhirsh/650/orig 2025-09-07T07:38:46.3252509Z * [new branch] gh/bdhirsh/663/base -> origin/gh/bdhirsh/663/base 2025-09-07T07:38:46.3253371Z * [new branch] gh/bdhirsh/663/head -> origin/gh/bdhirsh/663/head 2025-09-07T07:38:46.3253583Z * [new branch] gh/bdhirsh/663/orig -> origin/gh/bdhirsh/663/orig 2025-09-07T07:38:46.3254471Z * [new branch] gh/bdhirsh/665/base -> origin/gh/bdhirsh/665/base 2025-09-07T07:38:46.3254899Z * [new branch] gh/bdhirsh/665/head -> origin/gh/bdhirsh/665/head 2025-09-07T07:38:46.3255609Z * [new branch] gh/bdhirsh/665/orig -> origin/gh/bdhirsh/665/orig 2025-09-07T07:38:46.3256614Z * [new branch] gh/bdhirsh/666/base -> origin/gh/bdhirsh/666/base 2025-09-07T07:38:46.3257197Z * [new branch] gh/bdhirsh/666/head -> origin/gh/bdhirsh/666/head 2025-09-07T07:38:46.3257674Z * [new branch] gh/bdhirsh/666/orig -> origin/gh/bdhirsh/666/orig 2025-09-07T07:38:46.3258706Z * [new branch] gh/bdhirsh/667/base -> origin/gh/bdhirsh/667/base 2025-09-07T07:38:46.3259164Z * [new branch] gh/bdhirsh/667/head -> origin/gh/bdhirsh/667/head 2025-09-07T07:38:46.3259815Z * [new branch] gh/bdhirsh/667/orig -> origin/gh/bdhirsh/667/orig 2025-09-07T07:38:46.3260526Z * [new branch] gh/bdhirsh/668/base -> origin/gh/bdhirsh/668/base 2025-09-07T07:38:46.3260985Z * [new branch] gh/bdhirsh/668/head -> origin/gh/bdhirsh/668/head 2025-09-07T07:38:46.3261565Z * [new branch] gh/bdhirsh/668/orig -> origin/gh/bdhirsh/668/orig 2025-09-07T07:38:46.3262471Z * [new branch] gh/bdhirsh/669/base -> origin/gh/bdhirsh/669/base 2025-09-07T07:38:46.3262905Z * [new branch] gh/bdhirsh/669/head -> origin/gh/bdhirsh/669/head 2025-09-07T07:38:46.3263490Z * [new branch] gh/bdhirsh/669/orig -> origin/gh/bdhirsh/669/orig 2025-09-07T07:38:46.3264491Z * [new branch] gh/bdhirsh/670/base -> origin/gh/bdhirsh/670/base 2025-09-07T07:38:46.3265171Z * [new branch] gh/bdhirsh/670/head -> origin/gh/bdhirsh/670/head 2025-09-07T07:38:46.3265810Z * [new branch] gh/bdhirsh/670/orig -> origin/gh/bdhirsh/670/orig 2025-09-07T07:38:46.3266831Z * [new branch] gh/benjaminglass1/100/base -> origin/gh/benjaminglass1/100/base 2025-09-07T07:38:46.3267275Z * [new branch] gh/benjaminglass1/100/head -> origin/gh/benjaminglass1/100/head 2025-09-07T07:38:46.3267961Z * [new branch] gh/benjaminglass1/100/orig -> origin/gh/benjaminglass1/100/orig 2025-09-07T07:38:46.3268771Z * [new branch] gh/benjaminglass1/101/base -> origin/gh/benjaminglass1/101/base 2025-09-07T07:38:46.3269230Z * [new branch] gh/benjaminglass1/101/head -> origin/gh/benjaminglass1/101/head 2025-09-07T07:38:46.3269891Z * [new branch] gh/benjaminglass1/101/orig -> origin/gh/benjaminglass1/101/orig 2025-09-07T07:38:46.3270665Z * [new branch] gh/benjaminglass1/102/base -> origin/gh/benjaminglass1/102/base 2025-09-07T07:38:46.3271122Z * [new branch] gh/benjaminglass1/102/head -> origin/gh/benjaminglass1/102/head 2025-09-07T07:38:46.3271715Z * [new branch] gh/benjaminglass1/102/orig -> origin/gh/benjaminglass1/102/orig 2025-09-07T07:38:46.3272516Z * [new branch] gh/benjaminglass1/103/base -> origin/gh/benjaminglass1/103/base 2025-09-07T07:38:46.3272953Z * [new branch] gh/benjaminglass1/103/head -> origin/gh/benjaminglass1/103/head 2025-09-07T07:38:46.3273550Z * [new branch] gh/benjaminglass1/103/orig -> origin/gh/benjaminglass1/103/orig 2025-09-07T07:38:46.3274338Z * [new branch] gh/benjaminglass1/104/base -> origin/gh/benjaminglass1/104/base 2025-09-07T07:38:46.3274830Z * [new branch] gh/benjaminglass1/104/head -> 
origin/gh/benjaminglass1/104/head 2025-09-07T07:38:46.3275425Z * [new branch] gh/benjaminglass1/104/orig -> origin/gh/benjaminglass1/104/orig 2025-09-07T07:38:46.3276152Z * [new branch] gh/benjaminglass1/105/base -> origin/gh/benjaminglass1/105/base 2025-09-07T07:38:46.3276580Z * [new branch] gh/benjaminglass1/105/head -> origin/gh/benjaminglass1/105/head 2025-09-07T07:38:46.3277224Z * [new branch] gh/benjaminglass1/105/orig -> origin/gh/benjaminglass1/105/orig 2025-09-07T07:38:46.3277923Z * [new branch] gh/benjaminglass1/106/base -> origin/gh/benjaminglass1/106/base 2025-09-07T07:38:46.3278357Z * [new branch] gh/benjaminglass1/106/head -> origin/gh/benjaminglass1/106/head 2025-09-07T07:38:46.3278997Z * [new branch] gh/benjaminglass1/106/orig -> origin/gh/benjaminglass1/106/orig 2025-09-07T07:38:46.3279730Z * [new branch] gh/benjaminglass1/79/base -> origin/gh/benjaminglass1/79/base 2025-09-07T07:38:46.3280211Z * [new branch] gh/benjaminglass1/79/head -> origin/gh/benjaminglass1/79/head 2025-09-07T07:38:46.3280800Z * [new branch] gh/benjaminglass1/79/orig -> origin/gh/benjaminglass1/79/orig 2025-09-07T07:38:46.3281602Z * [new branch] gh/benjaminglass1/86/base -> origin/gh/benjaminglass1/86/base 2025-09-07T07:38:46.3282068Z * [new branch] gh/benjaminglass1/86/head -> origin/gh/benjaminglass1/86/head 2025-09-07T07:38:46.3282729Z * [new branch] gh/benjaminglass1/86/orig -> origin/gh/benjaminglass1/86/orig 2025-09-07T07:38:46.3283430Z * [new branch] gh/benjaminglass1/89/base -> origin/gh/benjaminglass1/89/base 2025-09-07T07:38:46.3284085Z * [new branch] gh/benjaminglass1/89/head -> origin/gh/benjaminglass1/89/head 2025-09-07T07:38:46.3284518Z * [new branch] gh/benjaminglass1/89/orig -> origin/gh/benjaminglass1/89/orig 2025-09-07T07:38:46.3285324Z * [new branch] gh/benjaminglass1/91/base -> origin/gh/benjaminglass1/91/base 2025-09-07T07:38:46.3285762Z * [new branch] gh/benjaminglass1/91/head -> origin/gh/benjaminglass1/91/head 2025-09-07T07:38:46.3286364Z * [new branch] gh/benjaminglass1/91/orig -> origin/gh/benjaminglass1/91/orig 2025-09-07T07:38:46.3287141Z * [new branch] gh/benjaminglass1/93/base -> origin/gh/benjaminglass1/93/base 2025-09-07T07:38:46.3287588Z * [new branch] gh/benjaminglass1/93/head -> origin/gh/benjaminglass1/93/head 2025-09-07T07:38:46.3288270Z * [new branch] gh/benjaminglass1/93/orig -> origin/gh/benjaminglass1/93/orig 2025-09-07T07:38:46.3289006Z * [new branch] gh/benjaminglass1/95/base -> origin/gh/benjaminglass1/95/base 2025-09-07T07:38:46.3289481Z * [new branch] gh/benjaminglass1/95/head -> origin/gh/benjaminglass1/95/head 2025-09-07T07:38:46.3290074Z * [new branch] gh/benjaminglass1/95/orig -> origin/gh/benjaminglass1/95/orig 2025-09-07T07:38:46.3290854Z * [new branch] gh/benjaminglass1/97/base -> origin/gh/benjaminglass1/97/base 2025-09-07T07:38:46.3291293Z * [new branch] gh/benjaminglass1/97/head -> origin/gh/benjaminglass1/97/head 2025-09-07T07:38:46.3291886Z * [new branch] gh/benjaminglass1/97/orig -> origin/gh/benjaminglass1/97/orig 2025-09-07T07:38:46.3292651Z * [new branch] gh/benjaminglass1/99/base -> origin/gh/benjaminglass1/99/base 2025-09-07T07:38:46.3293333Z * [new branch] gh/benjaminglass1/99/head -> origin/gh/benjaminglass1/99/head 2025-09-07T07:38:46.3293998Z * [new branch] gh/benjaminglass1/99/orig -> origin/gh/benjaminglass1/99/orig 2025-09-07T07:38:46.3294988Z * [new branch] gh/bobrenjc93/514/base -> origin/gh/bobrenjc93/514/base 2025-09-07T07:38:46.3295416Z * [new branch] gh/bobrenjc93/514/head -> origin/gh/bobrenjc93/514/head 
2025-09-07T07:38:46.3295995Z * [new branch] gh/bobrenjc93/514/orig -> origin/gh/bobrenjc93/514/orig 2025-09-07T07:38:46.3296705Z * [new branch] gh/bobrenjc93/521/base -> origin/gh/bobrenjc93/521/base 2025-09-07T07:38:46.3297173Z * [new branch] gh/bobrenjc93/521/head -> origin/gh/bobrenjc93/521/head 2025-09-07T07:38:46.3297669Z * [new branch] gh/bobrenjc93/521/orig -> origin/gh/bobrenjc93/521/orig 2025-09-07T07:38:46.3298502Z * [new branch] gh/bobrenjc93/522/base -> origin/gh/bobrenjc93/522/base 2025-09-07T07:38:46.3298961Z * [new branch] gh/bobrenjc93/522/head -> origin/gh/bobrenjc93/522/head 2025-09-07T07:38:46.3299459Z * [new branch] gh/bobrenjc93/522/orig -> origin/gh/bobrenjc93/522/orig 2025-09-07T07:38:46.3300279Z * [new branch] gh/bobrenjc93/525/base -> origin/gh/bobrenjc93/525/base 2025-09-07T07:38:46.3302124Z * [new branch] gh/bobrenjc93/525/head -> origin/gh/bobrenjc93/525/head 2025-09-07T07:38:46.3302889Z * [new branch] gh/bobrenjc93/525/orig -> origin/gh/bobrenjc93/525/orig 2025-09-07T07:38:46.3303233Z * [new branch] gh/bobrenjc93/526/base -> origin/gh/bobrenjc93/526/base 2025-09-07T07:38:46.3303546Z * [new branch] gh/bobrenjc93/526/head -> origin/gh/bobrenjc93/526/head 2025-09-07T07:38:46.3303917Z * [new branch] gh/bobrenjc93/526/orig -> origin/gh/bobrenjc93/526/orig 2025-09-07T07:38:46.3304230Z * [new branch] gh/bobrenjc93/527/base -> origin/gh/bobrenjc93/527/base 2025-09-07T07:38:46.3304667Z * [new branch] gh/bobrenjc93/527/head -> origin/gh/bobrenjc93/527/head 2025-09-07T07:38:46.3305154Z * [new branch] gh/bobrenjc93/527/orig -> origin/gh/bobrenjc93/527/orig 2025-09-07T07:38:46.3305746Z * [new branch] gh/bobrenjc93/528/base -> origin/gh/bobrenjc93/528/base 2025-09-07T07:38:46.3306262Z * [new branch] gh/bobrenjc93/528/head -> origin/gh/bobrenjc93/528/head 2025-09-07T07:38:46.3306773Z * [new branch] gh/bobrenjc93/528/orig -> origin/gh/bobrenjc93/528/orig 2025-09-07T07:38:46.3307677Z * [new branch] gh/bobrenjc93/529/base -> origin/gh/bobrenjc93/529/base 2025-09-07T07:38:46.3308065Z * [new branch] gh/bobrenjc93/529/head -> origin/gh/bobrenjc93/529/head 2025-09-07T07:38:46.3308602Z * [new branch] gh/bobrenjc93/529/orig -> origin/gh/bobrenjc93/529/orig 2025-09-07T07:38:46.3309404Z * [new branch] gh/bobrenjc93/535/base -> origin/gh/bobrenjc93/535/base 2025-09-07T07:38:46.3309835Z * [new branch] gh/bobrenjc93/535/head -> origin/gh/bobrenjc93/535/head 2025-09-07T07:38:46.3310352Z * [new branch] gh/bobrenjc93/535/orig -> origin/gh/bobrenjc93/535/orig 2025-09-07T07:38:46.3311194Z * [new branch] gh/bobrenjc93/537/base -> origin/gh/bobrenjc93/537/base 2025-09-07T07:38:46.3312158Z * [new branch] gh/bobrenjc93/537/head -> origin/gh/bobrenjc93/537/head 2025-09-07T07:38:46.3312478Z * [new branch] gh/bobrenjc93/537/orig -> origin/gh/bobrenjc93/537/orig 2025-09-07T07:38:46.3313453Z * [new branch] gh/bobrenjc93/539/base -> origin/gh/bobrenjc93/539/base 2025-09-07T07:38:46.3313900Z * [new branch] gh/bobrenjc93/539/head -> origin/gh/bobrenjc93/539/head 2025-09-07T07:38:46.3314474Z * [new branch] gh/bobrenjc93/539/orig -> origin/gh/bobrenjc93/539/orig 2025-09-07T07:38:46.3315300Z * [new branch] gh/bobrenjc93/540/base -> origin/gh/bobrenjc93/540/base 2025-09-07T07:38:46.3315824Z * [new branch] gh/bobrenjc93/540/head -> origin/gh/bobrenjc93/540/head 2025-09-07T07:38:46.3316425Z * [new branch] gh/bobrenjc93/540/orig -> origin/gh/bobrenjc93/540/orig 2025-09-07T07:38:46.3317180Z * [new branch] gh/bobrenjc93/541/base -> origin/gh/bobrenjc93/541/base 2025-09-07T07:38:46.3317622Z * [new branch] 
gh/bobrenjc93/541/head -> origin/gh/bobrenjc93/541/head 2025-09-07T07:38:46.3318157Z * [new branch] gh/bobrenjc93/541/orig -> origin/gh/bobrenjc93/541/orig 2025-09-07T07:38:46.3318911Z * [new branch] gh/bobrenjc93/542/base -> origin/gh/bobrenjc93/542/base 2025-09-07T07:38:46.3319367Z * [new branch] gh/bobrenjc93/542/head -> origin/gh/bobrenjc93/542/head 2025-09-07T07:38:46.3319927Z * [new branch] gh/bobrenjc93/542/orig -> origin/gh/bobrenjc93/542/orig 2025-09-07T07:38:46.3320879Z * [new branch] gh/bobrenjc93/543/base -> origin/gh/bobrenjc93/543/base 2025-09-07T07:38:46.3321307Z * [new branch] gh/bobrenjc93/543/head -> origin/gh/bobrenjc93/543/head 2025-09-07T07:38:46.3321872Z * [new branch] gh/bobrenjc93/543/orig -> origin/gh/bobrenjc93/543/orig 2025-09-07T07:38:46.3322783Z * [new branch] gh/bobrenjc93/544/base -> origin/gh/bobrenjc93/544/base 2025-09-07T07:38:46.3323109Z * [new branch] gh/bobrenjc93/544/head -> origin/gh/bobrenjc93/544/head 2025-09-07T07:38:46.3323916Z * [new branch] gh/bobrenjc93/544/orig -> origin/gh/bobrenjc93/544/orig 2025-09-07T07:38:46.3324683Z * [new branch] gh/bobrenjc93/545/base -> origin/gh/bobrenjc93/545/base 2025-09-07T07:38:46.3325202Z * [new branch] gh/bobrenjc93/545/head -> origin/gh/bobrenjc93/545/head 2025-09-07T07:38:46.3325757Z * [new branch] gh/bobrenjc93/545/orig -> origin/gh/bobrenjc93/545/orig 2025-09-07T07:38:46.3326966Z * [new branch] gh/bobrenjc93/546/base -> origin/gh/bobrenjc93/546/base 2025-09-07T07:38:46.3327432Z * [new branch] gh/bobrenjc93/546/head -> origin/gh/bobrenjc93/546/head 2025-09-07T07:38:46.3327981Z * [new branch] gh/bobrenjc93/546/orig -> origin/gh/bobrenjc93/546/orig 2025-09-07T07:38:46.3329263Z * [new branch] gh/bobrenjc93/547/base -> origin/gh/bobrenjc93/547/base 2025-09-07T07:38:46.3329746Z * [new branch] gh/bobrenjc93/547/head -> origin/gh/bobrenjc93/547/head 2025-09-07T07:38:46.3330307Z * [new branch] gh/bobrenjc93/547/orig -> origin/gh/bobrenjc93/547/orig 2025-09-07T07:38:46.3331190Z * [new branch] gh/bobrenjc93/548/base -> origin/gh/bobrenjc93/548/base 2025-09-07T07:38:46.3331592Z * [new branch] gh/bobrenjc93/548/head -> origin/gh/bobrenjc93/548/head 2025-09-07T07:38:46.3332162Z * [new branch] gh/bobrenjc93/548/orig -> origin/gh/bobrenjc93/548/orig 2025-09-07T07:38:46.3332939Z * [new branch] gh/bobrenjc93/549/base -> origin/gh/bobrenjc93/549/base 2025-09-07T07:38:46.3333398Z * [new branch] gh/bobrenjc93/549/head -> origin/gh/bobrenjc93/549/head 2025-09-07T07:38:46.3333963Z * [new branch] gh/bobrenjc93/549/orig -> origin/gh/bobrenjc93/549/orig 2025-09-07T07:38:46.3334961Z * [new branch] gh/bobrenjc93/550/base -> origin/gh/bobrenjc93/550/base 2025-09-07T07:38:46.3335426Z * [new branch] gh/bobrenjc93/550/head -> origin/gh/bobrenjc93/550/head 2025-09-07T07:38:46.3335962Z * [new branch] gh/bobrenjc93/550/orig -> origin/gh/bobrenjc93/550/orig 2025-09-07T07:38:46.3336967Z * [new branch] gh/bobrenjc93/551/base -> origin/gh/bobrenjc93/551/base 2025-09-07T07:38:46.3337434Z * [new branch] gh/bobrenjc93/551/head -> origin/gh/bobrenjc93/551/head 2025-09-07T07:38:46.3338005Z * [new branch] gh/bobrenjc93/551/orig -> origin/gh/bobrenjc93/551/orig 2025-09-07T07:38:46.3339197Z * [new branch] gh/bobrenjc93/552/base -> origin/gh/bobrenjc93/552/base 2025-09-07T07:38:46.3339669Z * [new branch] gh/bobrenjc93/552/head -> origin/gh/bobrenjc93/552/head 2025-09-07T07:38:46.3340190Z * [new branch] gh/bobrenjc93/552/orig -> origin/gh/bobrenjc93/552/orig 2025-09-07T07:38:46.3341125Z * [new branch] gh/bobrenjc93/553/base -> 
origin/gh/bobrenjc93/553/base 2025-09-07T07:38:46.3341554Z * [new branch] gh/bobrenjc93/553/head -> origin/gh/bobrenjc93/553/head 2025-09-07T07:38:46.3342063Z * [new branch] gh/bobrenjc93/553/orig -> origin/gh/bobrenjc93/553/orig 2025-09-07T07:38:46.3342722Z * [new branch] gh/bobrenjc93/554/base -> origin/gh/bobrenjc93/554/base 2025-09-07T07:38:46.3343281Z * [new branch] gh/bobrenjc93/554/head -> origin/gh/bobrenjc93/554/head 2025-09-07T07:38:46.3343791Z * [new branch] gh/bobrenjc93/554/orig -> origin/gh/bobrenjc93/554/orig 2025-09-07T07:38:46.3344762Z * [new branch] gh/bobrenjc93/555/base -> origin/gh/bobrenjc93/555/base 2025-09-07T07:38:46.3345216Z * [new branch] gh/bobrenjc93/555/head -> origin/gh/bobrenjc93/555/head 2025-09-07T07:38:46.3345741Z * [new branch] gh/bobrenjc93/555/orig -> origin/gh/bobrenjc93/555/orig 2025-09-07T07:38:46.3346576Z * [new branch] gh/bobrenjc93/556/base -> origin/gh/bobrenjc93/556/base 2025-09-07T07:38:46.3346993Z * [new branch] gh/bobrenjc93/556/head -> origin/gh/bobrenjc93/556/head 2025-09-07T07:38:46.3347518Z * [new branch] gh/bobrenjc93/556/orig -> origin/gh/bobrenjc93/556/orig 2025-09-07T07:38:46.3348573Z * [new branch] gh/briancoutinho/2/base -> origin/gh/briancoutinho/2/base 2025-09-07T07:38:46.3349067Z * [new branch] gh/briancoutinho/2/head -> origin/gh/briancoutinho/2/head 2025-09-07T07:38:46.3350117Z * [new branch] gh/c00w/23/base -> origin/gh/c00w/23/base 2025-09-07T07:38:46.3350694Z * [new branch] gh/c00w/23/head -> origin/gh/c00w/23/head 2025-09-07T07:38:46.3351585Z * [new branch] gh/c00w/48/base -> origin/gh/c00w/48/base 2025-09-07T07:38:46.3352079Z * [new branch] gh/c00w/48/head -> origin/gh/c00w/48/head 2025-09-07T07:38:46.3352637Z * [new branch] gh/c00w/48/orig -> origin/gh/c00w/48/orig 2025-09-07T07:38:46.3353654Z * [new branch] gh/c00w/53/base -> origin/gh/c00w/53/base 2025-09-07T07:38:46.3353982Z * [new branch] gh/c00w/53/head -> origin/gh/c00w/53/head 2025-09-07T07:38:46.3354489Z * [new branch] gh/c00w/53/orig -> origin/gh/c00w/53/orig 2025-09-07T07:38:46.3355261Z * [new branch] gh/c00w/54/base -> origin/gh/c00w/54/base 2025-09-07T07:38:46.3355683Z * [new branch] gh/c00w/54/head -> origin/gh/c00w/54/head 2025-09-07T07:38:46.3356217Z * [new branch] gh/c00w/54/orig -> origin/gh/c00w/54/orig 2025-09-07T07:38:46.3357077Z * [new branch] gh/c00w/55/base -> origin/gh/c00w/55/base 2025-09-07T07:38:46.3357618Z * [new branch] gh/c00w/55/head -> origin/gh/c00w/55/head 2025-09-07T07:38:46.3358137Z * [new branch] gh/c00w/55/orig -> origin/gh/c00w/55/orig 2025-09-07T07:38:46.3358904Z * [new branch] gh/c00w/56/base -> origin/gh/c00w/56/base 2025-09-07T07:38:46.3359407Z * [new branch] gh/c00w/56/head -> origin/gh/c00w/56/head 2025-09-07T07:38:46.3360180Z * [new branch] gh/c00w/56/orig -> origin/gh/c00w/56/orig 2025-09-07T07:38:46.3361094Z * [new branch] gh/clee2000/1/base -> origin/gh/clee2000/1/base 2025-09-07T07:38:46.3361664Z * [new branch] gh/clee2000/1/head -> origin/gh/clee2000/1/head 2025-09-07T07:38:46.3362175Z * [new branch] gh/clee2000/1/orig -> origin/gh/clee2000/1/orig 2025-09-07T07:38:46.3363234Z * [new branch] gh/coconutruben/1/base -> origin/gh/coconutruben/1/base 2025-09-07T07:38:46.3363815Z * [new branch] gh/coconutruben/1/head -> origin/gh/coconutruben/1/head 2025-09-07T07:38:46.3364778Z * [new branch] gh/coconutruben/11/base -> origin/gh/coconutruben/11/base 2025-09-07T07:38:46.3365310Z * [new branch] gh/coconutruben/11/head -> origin/gh/coconutruben/11/head 2025-09-07T07:38:46.3365889Z * [new branch] gh/coconutruben/11/orig -> 
origin/gh/coconutruben/11/orig 2025-09-07T07:38:46.3367128Z * [new branch] gh/coconutruben/12/base -> origin/gh/coconutruben/12/base 2025-09-07T07:38:46.3367770Z * [new branch] gh/coconutruben/12/head -> origin/gh/coconutruben/12/head 2025-09-07T07:38:46.3368548Z * [new branch] gh/coconutruben/12/orig -> origin/gh/coconutruben/12/orig 2025-09-07T07:38:46.3369312Z * [new branch] gh/coconutruben/13/base -> origin/gh/coconutruben/13/base 2025-09-07T07:38:46.3369831Z * [new branch] gh/coconutruben/13/head -> origin/gh/coconutruben/13/head 2025-09-07T07:38:46.3370610Z * [new branch] gh/coconutruben/13/orig -> origin/gh/coconutruben/13/orig 2025-09-07T07:38:46.3371464Z * [new branch] gh/coconutruben/14/base -> origin/gh/coconutruben/14/base 2025-09-07T07:38:46.3371971Z * [new branch] gh/coconutruben/14/head -> origin/gh/coconutruben/14/head 2025-09-07T07:38:46.3372511Z * [new branch] gh/coconutruben/14/orig -> origin/gh/coconutruben/14/orig 2025-09-07T07:38:46.3373630Z * [new branch] gh/coconutruben/15/base -> origin/gh/coconutruben/15/base 2025-09-07T07:38:46.3374253Z * [new branch] gh/coconutruben/15/head -> origin/gh/coconutruben/15/head 2025-09-07T07:38:46.3374903Z * [new branch] gh/coconutruben/15/orig -> origin/gh/coconutruben/15/orig 2025-09-07T07:38:46.3375739Z * [new branch] gh/coconutruben/16/base -> origin/gh/coconutruben/16/base 2025-09-07T07:38:46.3376158Z * [new branch] gh/coconutruben/16/head -> origin/gh/coconutruben/16/head 2025-09-07T07:38:46.3376668Z * [new branch] gh/coconutruben/16/orig -> origin/gh/coconutruben/16/orig 2025-09-07T07:38:46.3377687Z * [new branch] gh/coconutruben/17/base -> origin/gh/coconutruben/17/base 2025-09-07T07:38:46.3378236Z * [new branch] gh/coconutruben/17/head -> origin/gh/coconutruben/17/head 2025-09-07T07:38:46.3379132Z * [new branch] gh/coconutruben/17/orig -> origin/gh/coconutruben/17/orig 2025-09-07T07:38:46.3379638Z * [new branch] gh/coconutruben/18/base -> origin/gh/coconutruben/18/base 2025-09-07T07:38:46.3380212Z * [new branch] gh/coconutruben/18/head -> origin/gh/coconutruben/18/head 2025-09-07T07:38:46.3381034Z * [new branch] gh/coconutruben/18/orig -> origin/gh/coconutruben/18/orig 2025-09-07T07:38:46.3383440Z * [new branch] gh/coconutruben/19/base -> origin/gh/coconutruben/19/base 2025-09-07T07:38:46.3383979Z * [new branch] gh/coconutruben/19/head -> origin/gh/coconutruben/19/head 2025-09-07T07:38:46.3384548Z * [new branch] gh/coconutruben/19/orig -> origin/gh/coconutruben/19/orig 2025-09-07T07:38:46.3385544Z * [new branch] gh/coconutruben/20/base -> origin/gh/coconutruben/20/base 2025-09-07T07:38:46.3386070Z * [new branch] gh/coconutruben/20/head -> origin/gh/coconutruben/20/head 2025-09-07T07:38:46.3386698Z * [new branch] gh/coconutruben/20/orig -> origin/gh/coconutruben/20/orig 2025-09-07T07:38:46.3387630Z * [new branch] gh/coconutruben/21/base -> origin/gh/coconutruben/21/base 2025-09-07T07:38:46.3388171Z * [new branch] gh/coconutruben/21/head -> origin/gh/coconutruben/21/head 2025-09-07T07:38:46.3388718Z * [new branch] gh/coconutruben/21/orig -> origin/gh/coconutruben/21/orig 2025-09-07T07:38:46.3389938Z * [new branch] gh/coconutruben/22/base -> origin/gh/coconutruben/22/base 2025-09-07T07:38:46.3390395Z * [new branch] gh/coconutruben/22/head -> origin/gh/coconutruben/22/head 2025-09-07T07:38:46.3391049Z * [new branch] gh/coconutruben/22/orig -> origin/gh/coconutruben/22/orig 2025-09-07T07:38:46.3391927Z * [new branch] gh/coconutruben/24/base -> origin/gh/coconutruben/24/base 2025-09-07T07:38:46.3392481Z * [new branch] 
gh/coconutruben/24/head -> origin/gh/coconutruben/24/head 2025-09-07T07:38:46.3393304Z * [new branch] gh/coconutruben/24/orig -> origin/gh/coconutruben/24/orig 2025-09-07T07:38:46.3394424Z * [new branch] gh/coconutruben/25/base -> origin/gh/coconutruben/25/base 2025-09-07T07:38:46.3395279Z * [new branch] gh/coconutruben/25/head -> origin/gh/coconutruben/25/head 2025-09-07T07:38:46.3396060Z * [new branch] gh/coconutruben/25/orig -> origin/gh/coconutruben/25/orig 2025-09-07T07:38:46.3396829Z * [new branch] gh/coconutruben/28/base -> origin/gh/coconutruben/28/base 2025-09-07T07:38:46.3397317Z * [new branch] gh/coconutruben/28/head -> origin/gh/coconutruben/28/head 2025-09-07T07:38:46.3397950Z * [new branch] gh/coconutruben/28/orig -> origin/gh/coconutruben/28/orig 2025-09-07T07:38:46.3398892Z * [new branch] gh/coconutruben/29/base -> origin/gh/coconutruben/29/base 2025-09-07T07:38:46.3399400Z * [new branch] gh/coconutruben/29/head -> origin/gh/coconutruben/29/head 2025-09-07T07:38:46.3399989Z * [new branch] gh/coconutruben/29/orig -> origin/gh/coconutruben/29/orig 2025-09-07T07:38:46.3400904Z * [new branch] gh/coconutruben/30/base -> origin/gh/coconutruben/30/base 2025-09-07T07:38:46.3401393Z * [new branch] gh/coconutruben/30/head -> origin/gh/coconutruben/30/head 2025-09-07T07:38:46.3401966Z * [new branch] gh/coconutruben/30/orig -> origin/gh/coconutruben/30/orig 2025-09-07T07:38:46.3403194Z * [new branch] gh/coconutruben/31/base -> origin/gh/coconutruben/31/base 2025-09-07T07:38:46.3403682Z * [new branch] gh/coconutruben/31/head -> origin/gh/coconutruben/31/head 2025-09-07T07:38:46.3404460Z * [new branch] gh/coconutruben/31/orig -> origin/gh/coconutruben/31/orig 2025-09-07T07:38:46.3405705Z * [new branch] gh/coconutruben/32/base -> origin/gh/coconutruben/32/base 2025-09-07T07:38:46.3406231Z * [new branch] gh/coconutruben/32/head -> origin/gh/coconutruben/32/head 2025-09-07T07:38:46.3406893Z * [new branch] gh/coconutruben/32/orig -> origin/gh/coconutruben/32/orig 2025-09-07T07:38:46.3408035Z * [new branch] gh/coconutruben/33/base -> origin/gh/coconutruben/33/base 2025-09-07T07:38:46.3408547Z * [new branch] gh/coconutruben/33/head -> origin/gh/coconutruben/33/head 2025-09-07T07:38:46.3409170Z * [new branch] gh/coconutruben/33/orig -> origin/gh/coconutruben/33/orig 2025-09-07T07:38:46.3409987Z * [new branch] gh/coconutruben/34/base -> origin/gh/coconutruben/34/base 2025-09-07T07:38:46.3410387Z * [new branch] gh/coconutruben/34/head -> origin/gh/coconutruben/34/head 2025-09-07T07:38:46.3410951Z * [new branch] gh/coconutruben/34/orig -> origin/gh/coconutruben/34/orig 2025-09-07T07:38:46.3411794Z * [new branch] gh/coconutruben/35/base -> origin/gh/coconutruben/35/base 2025-09-07T07:38:46.3412269Z * [new branch] gh/coconutruben/35/head -> origin/gh/coconutruben/35/head 2025-09-07T07:38:46.3412844Z * [new branch] gh/coconutruben/35/orig -> origin/gh/coconutruben/35/orig 2025-09-07T07:38:46.3414785Z * [new branch] gh/coconutruben/36/base -> origin/gh/coconutruben/36/base 2025-09-07T07:38:46.3415658Z * [new branch] gh/coconutruben/36/head -> origin/gh/coconutruben/36/head 2025-09-07T07:38:46.3417015Z * [new branch] gh/coconutruben/36/orig -> origin/gh/coconutruben/36/orig 2025-09-07T07:38:46.3417995Z * [new branch] gh/coconutruben/37/base -> origin/gh/coconutruben/37/base 2025-09-07T07:38:46.3418403Z * [new branch] gh/coconutruben/37/head -> origin/gh/coconutruben/37/head 2025-09-07T07:38:46.3418947Z * [new branch] gh/coconutruben/37/orig -> origin/gh/coconutruben/37/orig 2025-09-07T07:38:46.3419922Z 
* [new branch] gh/coconutruben/38/base -> origin/gh/coconutruben/38/base 2025-09-07T07:38:46.3420509Z * [new branch] gh/coconutruben/38/head -> origin/gh/coconutruben/38/head 2025-09-07T07:38:46.3421050Z * [new branch] gh/coconutruben/38/orig -> origin/gh/coconutruben/38/orig 2025-09-07T07:38:46.3421988Z * [new branch] gh/coconutruben/39/base -> origin/gh/coconutruben/39/base 2025-09-07T07:38:46.3422439Z * [new branch] gh/coconutruben/39/head -> origin/gh/coconutruben/39/head 2025-09-07T07:38:46.3422944Z * [new branch] gh/coconutruben/39/orig -> origin/gh/coconutruben/39/orig 2025-09-07T07:38:46.3423976Z * [new branch] gh/coconutruben/40/base -> origin/gh/coconutruben/40/base 2025-09-07T07:38:46.3424431Z * [new branch] gh/coconutruben/40/head -> origin/gh/coconutruben/40/head 2025-09-07T07:38:46.3425017Z * [new branch] gh/coconutruben/40/orig -> origin/gh/coconutruben/40/orig 2025-09-07T07:38:46.3426085Z * [new branch] gh/coconutruben/41/base -> origin/gh/coconutruben/41/base 2025-09-07T07:38:46.3426726Z * [new branch] gh/coconutruben/41/head -> origin/gh/coconutruben/41/head 2025-09-07T07:38:46.3427252Z * [new branch] gh/coconutruben/41/orig -> origin/gh/coconutruben/41/orig 2025-09-07T07:38:46.3428244Z * [new branch] gh/coconutruben/42/base -> origin/gh/coconutruben/42/base 2025-09-07T07:38:46.3428759Z * [new branch] gh/coconutruben/42/head -> origin/gh/coconutruben/42/head 2025-09-07T07:38:46.3429348Z * [new branch] gh/coconutruben/42/orig -> origin/gh/coconutruben/42/orig 2025-09-07T07:38:46.3430356Z * [new branch] gh/coconutruben/43/base -> origin/gh/coconutruben/43/base 2025-09-07T07:38:46.3430843Z * [new branch] gh/coconutruben/43/head -> origin/gh/coconutruben/43/head 2025-09-07T07:38:46.3431384Z * [new branch] gh/coconutruben/43/orig -> origin/gh/coconutruben/43/orig 2025-09-07T07:38:46.3432463Z * [new branch] gh/coconutruben/44/base -> origin/gh/coconutruben/44/base 2025-09-07T07:38:46.3433032Z * [new branch] gh/coconutruben/44/head -> origin/gh/coconutruben/44/head 2025-09-07T07:38:46.3433792Z * [new branch] gh/coconutruben/44/orig -> origin/gh/coconutruben/44/orig 2025-09-07T07:38:46.3434733Z * [new branch] gh/coconutruben/45/base -> origin/gh/coconutruben/45/base 2025-09-07T07:38:46.3435227Z * [new branch] gh/coconutruben/45/head -> origin/gh/coconutruben/45/head 2025-09-07T07:38:46.3435806Z * [new branch] gh/coconutruben/45/orig -> origin/gh/coconutruben/45/orig 2025-09-07T07:38:46.3436648Z * [new branch] gh/coconutruben/46/base -> origin/gh/coconutruben/46/base 2025-09-07T07:38:46.3437272Z * [new branch] gh/coconutruben/46/head -> origin/gh/coconutruben/46/head 2025-09-07T07:38:46.3437871Z * [new branch] gh/coconutruben/46/orig -> origin/gh/coconutruben/46/orig 2025-09-07T07:38:46.3438873Z * [new branch] gh/coconutruben/47/base -> origin/gh/coconutruben/47/base 2025-09-07T07:38:46.3439341Z * [new branch] gh/coconutruben/47/head -> origin/gh/coconutruben/47/head 2025-09-07T07:38:46.3439931Z * [new branch] gh/coconutruben/47/orig -> origin/gh/coconutruben/47/orig 2025-09-07T07:38:46.3440975Z * [new branch] gh/coconutruben/48/base -> origin/gh/coconutruben/48/base 2025-09-07T07:38:46.3441501Z * [new branch] gh/coconutruben/48/head -> origin/gh/coconutruben/48/head 2025-09-07T07:38:46.3442104Z * [new branch] gh/coconutruben/48/orig -> origin/gh/coconutruben/48/orig 2025-09-07T07:38:46.3443173Z * [new branch] gh/coconutruben/49/base -> origin/gh/coconutruben/49/base 2025-09-07T07:38:46.3443674Z * [new branch] gh/coconutruben/49/head -> origin/gh/coconutruben/49/head 
2025-09-07T07:38:46.3444212Z * [new branch] gh/coconutruben/49/orig -> origin/gh/coconutruben/49/orig 2025-09-07T07:38:46.3445126Z * [new branch] gh/coconutruben/50/base -> origin/gh/coconutruben/50/base 2025-09-07T07:38:46.3445657Z * [new branch] gh/coconutruben/50/head -> origin/gh/coconutruben/50/head 2025-09-07T07:38:46.3446284Z * [new branch] gh/coconutruben/50/orig -> origin/gh/coconutruben/50/orig 2025-09-07T07:38:46.3447110Z * [new branch] gh/coconutruben/51/base -> origin/gh/coconutruben/51/base 2025-09-07T07:38:46.3447703Z * [new branch] gh/coconutruben/51/head -> origin/gh/coconutruben/51/head 2025-09-07T07:38:46.3448318Z * [new branch] gh/coconutruben/51/orig -> origin/gh/coconutruben/51/orig 2025-09-07T07:38:46.3449393Z * [new branch] gh/coconutruben/52/base -> origin/gh/coconutruben/52/base 2025-09-07T07:38:46.3449928Z * [new branch] gh/coconutruben/52/head -> origin/gh/coconutruben/52/head 2025-09-07T07:38:46.3450750Z * [new branch] gh/coconutruben/52/orig -> origin/gh/coconutruben/52/orig 2025-09-07T07:38:46.3451623Z * [new branch] gh/coconutruben/53/base -> origin/gh/coconutruben/53/base 2025-09-07T07:38:46.3452056Z * [new branch] gh/coconutruben/53/head -> origin/gh/coconutruben/53/head 2025-09-07T07:38:46.3452585Z * [new branch] gh/coconutruben/53/orig -> origin/gh/coconutruben/53/orig 2025-09-07T07:38:46.3453431Z * [new branch] gh/coconutruben/54/base -> origin/gh/coconutruben/54/base 2025-09-07T07:38:46.3453948Z * [new branch] gh/coconutruben/54/head -> origin/gh/coconutruben/54/head 2025-09-07T07:38:46.3454503Z * [new branch] gh/coconutruben/54/orig -> origin/gh/coconutruben/54/orig 2025-09-07T07:38:46.3455435Z * [new branch] gh/coconutruben/55/base -> origin/gh/coconutruben/55/base 2025-09-07T07:38:46.3455920Z * [new branch] gh/coconutruben/55/head -> origin/gh/coconutruben/55/head 2025-09-07T07:38:46.3456530Z * [new branch] gh/coconutruben/55/orig -> origin/gh/coconutruben/55/orig 2025-09-07T07:38:46.3457424Z * [new branch] gh/coconutruben/56/base -> origin/gh/coconutruben/56/base 2025-09-07T07:38:46.3458060Z * [new branch] gh/coconutruben/56/head -> origin/gh/coconutruben/56/head 2025-09-07T07:38:46.3458714Z * [new branch] gh/coconutruben/56/orig -> origin/gh/coconutruben/56/orig 2025-09-07T07:38:46.3459651Z * [new branch] gh/coconutruben/57/base -> origin/gh/coconutruben/57/base 2025-09-07T07:38:46.3460226Z * [new branch] gh/coconutruben/57/head -> origin/gh/coconutruben/57/head 2025-09-07T07:38:46.3460794Z * [new branch] gh/coconutruben/57/orig -> origin/gh/coconutruben/57/orig 2025-09-07T07:38:46.3461919Z * [new branch] gh/coconutruben/58/base -> origin/gh/coconutruben/58/base 2025-09-07T07:38:46.3462504Z * [new branch] gh/coconutruben/58/head -> origin/gh/coconutruben/58/head 2025-09-07T07:38:46.3463068Z * [new branch] gh/coconutruben/58/orig -> origin/gh/coconutruben/58/orig 2025-09-07T07:38:46.3463915Z * [new branch] gh/coconutruben/59/base -> origin/gh/coconutruben/59/base 2025-09-07T07:38:46.3464308Z * [new branch] gh/coconutruben/59/head -> origin/gh/coconutruben/59/head 2025-09-07T07:38:46.3464845Z * [new branch] gh/coconutruben/59/orig -> origin/gh/coconutruben/59/orig 2025-09-07T07:38:46.3465723Z * [new branch] gh/coconutruben/60/base -> origin/gh/coconutruben/60/base 2025-09-07T07:38:46.3466251Z * [new branch] gh/coconutruben/60/head -> origin/gh/coconutruben/60/head 2025-09-07T07:38:46.3466901Z * [new branch] gh/coconutruben/60/orig -> origin/gh/coconutruben/60/orig 2025-09-07T07:38:46.3467752Z * [new branch] gh/coconutruben/61/base -> 
origin/gh/coconutruben/61/base 2025-09-07T07:38:46.3468400Z * [new branch] gh/coconutruben/61/head -> origin/gh/coconutruben/61/head 2025-09-07T07:38:46.3468946Z * [new branch] gh/coconutruben/61/orig -> origin/gh/coconutruben/61/orig 2025-09-07T07:38:46.3469964Z * [new branch] gh/coconutruben/62/base -> origin/gh/coconutruben/62/base 2025-09-07T07:38:46.3470435Z * [new branch] gh/coconutruben/62/head -> origin/gh/coconutruben/62/head 2025-09-07T07:38:46.3471085Z * [new branch] gh/coconutruben/62/orig -> origin/gh/coconutruben/62/orig 2025-09-07T07:38:46.3472094Z * [new branch] gh/coconutruben/63/base -> origin/gh/coconutruben/63/base 2025-09-07T07:38:46.3472618Z * [new branch] gh/coconutruben/63/head -> origin/gh/coconutruben/63/head 2025-09-07T07:38:46.3473181Z * [new branch] gh/coconutruben/63/orig -> origin/gh/coconutruben/63/orig 2025-09-07T07:38:46.3474064Z * [new branch] gh/coconutruben/64/base -> origin/gh/coconutruben/64/base 2025-09-07T07:38:46.3474577Z * [new branch] gh/coconutruben/64/head -> origin/gh/coconutruben/64/head 2025-09-07T07:38:46.3475189Z * [new branch] gh/coconutruben/64/orig -> origin/gh/coconutruben/64/orig 2025-09-07T07:38:46.3476093Z * [new branch] gh/coconutruben/65/base -> origin/gh/coconutruben/65/base 2025-09-07T07:38:46.3476578Z * [new branch] gh/coconutruben/65/head -> origin/gh/coconutruben/65/head 2025-09-07T07:38:46.3477161Z * [new branch] gh/coconutruben/65/orig -> origin/gh/coconutruben/65/orig 2025-09-07T07:38:46.3478190Z * [new branch] gh/coconutruben/66/base -> origin/gh/coconutruben/66/base 2025-09-07T07:38:46.3478601Z * [new branch] gh/coconutruben/66/head -> origin/gh/coconutruben/66/head 2025-09-07T07:38:46.3479108Z * [new branch] gh/coconutruben/66/orig -> origin/gh/coconutruben/66/orig 2025-09-07T07:38:46.3480503Z * [new branch] gh/codingwithsurya/12/base -> origin/gh/codingwithsurya/12/base 2025-09-07T07:38:46.3481137Z * [new branch] gh/codingwithsurya/12/head -> origin/gh/codingwithsurya/12/head 2025-09-07T07:38:46.3481825Z * [new branch] gh/codingwithsurya/12/orig -> origin/gh/codingwithsurya/12/orig 2025-09-07T07:38:46.3482674Z * [new branch] gh/codingwithsurya/14/base -> origin/gh/codingwithsurya/14/base 2025-09-07T07:38:46.3483195Z * [new branch] gh/codingwithsurya/14/head -> origin/gh/codingwithsurya/14/head 2025-09-07T07:38:46.3483693Z * [new branch] gh/codingwithsurya/14/orig -> origin/gh/codingwithsurya/14/orig 2025-09-07T07:38:46.3484704Z * [new branch] gh/codingwithsurya/15/base -> origin/gh/codingwithsurya/15/base 2025-09-07T07:38:46.3485204Z * [new branch] gh/codingwithsurya/15/head -> origin/gh/codingwithsurya/15/head 2025-09-07T07:38:46.3485780Z * [new branch] gh/codingwithsurya/15/orig -> origin/gh/codingwithsurya/15/orig 2025-09-07T07:38:46.3486703Z * [new branch] gh/codingwithsurya/16/base -> origin/gh/codingwithsurya/16/base 2025-09-07T07:38:46.3487185Z * [new branch] gh/codingwithsurya/16/head -> origin/gh/codingwithsurya/16/head 2025-09-07T07:38:46.3487714Z * [new branch] gh/codingwithsurya/16/orig -> origin/gh/codingwithsurya/16/orig 2025-09-07T07:38:46.3488906Z * [new branch] gh/codingwithsurya/17/base -> origin/gh/codingwithsurya/17/base 2025-09-07T07:38:46.3489416Z * [new branch] gh/codingwithsurya/17/head -> origin/gh/codingwithsurya/17/head 2025-09-07T07:38:46.3489943Z * [new branch] gh/codingwithsurya/17/orig -> origin/gh/codingwithsurya/17/orig 2025-09-07T07:38:46.3490877Z * [new branch] gh/codingwithsurya/18/base -> origin/gh/codingwithsurya/18/base 2025-09-07T07:38:46.3491365Z * [new branch] 
gh/codingwithsurya/18/head -> origin/gh/codingwithsurya/18/head 2025-09-07T07:38:46.3491884Z * [new branch] gh/codingwithsurya/18/orig -> origin/gh/codingwithsurya/18/orig 2025-09-07T07:38:46.3492885Z * [new branch] gh/codingwithsurya/19/base -> origin/gh/codingwithsurya/19/base 2025-09-07T07:38:46.3493356Z * [new branch] gh/codingwithsurya/19/head -> origin/gh/codingwithsurya/19/head 2025-09-07T07:38:46.3493879Z * [new branch] gh/codingwithsurya/19/orig -> origin/gh/codingwithsurya/19/orig 2025-09-07T07:38:46.3494717Z * [new branch] gh/codingwithsurya/20/base -> origin/gh/codingwithsurya/20/base 2025-09-07T07:38:46.3495257Z * [new branch] gh/codingwithsurya/20/head -> origin/gh/codingwithsurya/20/head 2025-09-07T07:38:46.3495687Z * [new branch] gh/codingwithsurya/20/orig -> origin/gh/codingwithsurya/20/orig 2025-09-07T07:38:46.3496642Z * [new branch] gh/codingwithsurya/21/base -> origin/gh/codingwithsurya/21/base 2025-09-07T07:38:46.3497163Z * [new branch] gh/codingwithsurya/21/head -> origin/gh/codingwithsurya/21/head 2025-09-07T07:38:46.3497673Z * [new branch] gh/codingwithsurya/21/orig -> origin/gh/codingwithsurya/21/orig 2025-09-07T07:38:46.3498917Z * [new branch] gh/colinchan15/1/base -> origin/gh/colinchan15/1/base 2025-09-07T07:38:46.3499345Z * [new branch] gh/colinchan15/1/head -> origin/gh/colinchan15/1/head 2025-09-07T07:38:46.3499999Z * [new branch] gh/colinchan15/2/base -> origin/gh/colinchan15/2/base 2025-09-07T07:38:46.3500466Z * [new branch] gh/colinchan15/2/head -> origin/gh/colinchan15/2/head 2025-09-07T07:38:46.3501224Z * [new branch] gh/colinchan15/3/base -> origin/gh/colinchan15/3/base 2025-09-07T07:38:46.3501587Z * [new branch] gh/colinchan15/3/head -> origin/gh/colinchan15/3/head 2025-09-07T07:38:46.3502349Z * [new branch] gh/colinchan15/6/base -> origin/gh/colinchan15/6/base 2025-09-07T07:38:46.3503216Z * [new branch] gh/colinchan15/6/head -> origin/gh/colinchan15/6/head 2025-09-07T07:38:46.3504296Z * [new branch] gh/davidberard98/382/base -> origin/gh/davidberard98/382/base 2025-09-07T07:38:46.3504882Z * [new branch] gh/davidberard98/382/head -> origin/gh/davidberard98/382/head 2025-09-07T07:38:46.3505410Z * [new branch] gh/davidberard98/382/orig -> origin/gh/davidberard98/382/orig 2025-09-07T07:38:46.3506299Z * [new branch] gh/davidberard98/386/base -> origin/gh/davidberard98/386/base 2025-09-07T07:38:46.3506776Z * [new branch] gh/davidberard98/386/head -> origin/gh/davidberard98/386/head 2025-09-07T07:38:46.3507311Z * [new branch] gh/davidberard98/386/orig -> origin/gh/davidberard98/386/orig 2025-09-07T07:38:46.3508181Z * [new branch] gh/davidberard98/391/base -> origin/gh/davidberard98/391/base 2025-09-07T07:38:46.3508766Z * [new branch] gh/davidberard98/391/head -> origin/gh/davidberard98/391/head 2025-09-07T07:38:46.3509232Z * [new branch] gh/davidberard98/391/orig -> origin/gh/davidberard98/391/orig 2025-09-07T07:38:46.3510081Z * [new branch] gh/davidberard98/392/base -> origin/gh/davidberard98/392/base 2025-09-07T07:38:46.3510521Z * [new branch] gh/davidberard98/392/head -> origin/gh/davidberard98/392/head 2025-09-07T07:38:46.3511058Z * [new branch] gh/davidberard98/392/orig -> origin/gh/davidberard98/392/orig 2025-09-07T07:38:46.3512011Z * [new branch] gh/davidberard98/394/base -> origin/gh/davidberard98/394/base 2025-09-07T07:38:46.3512486Z * [new branch] gh/davidberard98/394/head -> origin/gh/davidberard98/394/head 2025-09-07T07:38:46.3513030Z * [new branch] gh/davidberard98/394/orig -> origin/gh/davidberard98/394/orig 2025-09-07T07:38:46.3513934Z * [new 
branch] gh/davidberard98/396/base -> origin/gh/davidberard98/396/base 2025-09-07T07:38:46.3514381Z * [new branch] gh/davidberard98/396/head -> origin/gh/davidberard98/396/head 2025-09-07T07:38:46.3514911Z * [new branch] gh/davidberard98/396/orig -> origin/gh/davidberard98/396/orig 2025-09-07T07:38:46.3515902Z * [new branch] gh/davidberard98/397/base -> origin/gh/davidberard98/397/base 2025-09-07T07:38:46.3516362Z * [new branch] gh/davidberard98/397/head -> origin/gh/davidberard98/397/head 2025-09-07T07:38:46.3516899Z * [new branch] gh/davidberard98/397/orig -> origin/gh/davidberard98/397/orig 2025-09-07T07:38:46.3517905Z * [new branch] gh/davidberard98/398/base -> origin/gh/davidberard98/398/base 2025-09-07T07:38:46.3518268Z * [new branch] gh/davidberard98/398/head -> origin/gh/davidberard98/398/head 2025-09-07T07:38:46.3518854Z * [new branch] gh/davidberard98/398/orig -> origin/gh/davidberard98/398/orig 2025-09-07T07:38:46.3519772Z * [new branch] gh/davidberard98/399/base -> origin/gh/davidberard98/399/base 2025-09-07T07:38:46.3520296Z * [new branch] gh/davidberard98/399/head -> origin/gh/davidberard98/399/head 2025-09-07T07:38:46.3520807Z * [new branch] gh/davidberard98/399/orig -> origin/gh/davidberard98/399/orig 2025-09-07T07:38:46.3521743Z * [new branch] gh/davidberard98/400/base -> origin/gh/davidberard98/400/base 2025-09-07T07:38:46.3522227Z * [new branch] gh/davidberard98/400/head -> origin/gh/davidberard98/400/head 2025-09-07T07:38:46.3522744Z * [new branch] gh/davidberard98/400/orig -> origin/gh/davidberard98/400/orig 2025-09-07T07:38:46.3523570Z * [new branch] gh/davidberard98/401/base -> origin/gh/davidberard98/401/base 2025-09-07T07:38:46.3523985Z * [new branch] gh/davidberard98/401/head -> origin/gh/davidberard98/401/head 2025-09-07T07:38:46.3524537Z * [new branch] gh/davidberard98/401/orig -> origin/gh/davidberard98/401/orig 2025-09-07T07:38:46.3525353Z * [new branch] gh/davidberard98/402/base -> origin/gh/davidberard98/402/base 2025-09-07T07:38:46.3525788Z * [new branch] gh/davidberard98/402/head -> origin/gh/davidberard98/402/head 2025-09-07T07:38:46.3526345Z * [new branch] gh/davidberard98/402/orig -> origin/gh/davidberard98/402/orig 2025-09-07T07:38:46.3527306Z * [new branch] gh/davidberard98/403/base -> origin/gh/davidberard98/403/base 2025-09-07T07:38:46.3527750Z * [new branch] gh/davidberard98/403/head -> origin/gh/davidberard98/403/head 2025-09-07T07:38:46.3528257Z * [new branch] gh/davidberard98/403/orig -> origin/gh/davidberard98/403/orig 2025-09-07T07:38:46.3529225Z * [new branch] gh/davidberard98/404/base -> origin/gh/davidberard98/404/base 2025-09-07T07:38:46.3529686Z * [new branch] gh/davidberard98/404/head -> origin/gh/davidberard98/404/head 2025-09-07T07:38:46.3530153Z * [new branch] gh/davidberard98/404/orig -> origin/gh/davidberard98/404/orig 2025-09-07T07:38:46.3530983Z * [new branch] gh/davidberard98/405/base -> origin/gh/davidberard98/405/base 2025-09-07T07:38:46.3531477Z * [new branch] gh/davidberard98/405/head -> origin/gh/davidberard98/405/head 2025-09-07T07:38:46.3532119Z * [new branch] gh/davidberard98/405/orig -> origin/gh/davidberard98/405/orig 2025-09-07T07:38:46.3533113Z * [new branch] gh/davidberard98/406/base -> origin/gh/davidberard98/406/base 2025-09-07T07:38:46.3533662Z * [new branch] gh/davidberard98/406/head -> origin/gh/davidberard98/406/head 2025-09-07T07:38:46.3534266Z * [new branch] gh/davidberard98/406/orig -> origin/gh/davidberard98/406/orig 2025-09-07T07:38:46.3535472Z * [new branch] gh/davidberard98/407/base -> 
origin/gh/davidberard98/407/base 2025-09-07T07:38:46.3535904Z * [new branch] gh/davidberard98/407/head -> origin/gh/davidberard98/407/head 2025-09-07T07:38:46.3536414Z * [new branch] gh/davidberard98/407/orig -> origin/gh/davidberard98/407/orig 2025-09-07T07:38:46.3537449Z * [new branch] gh/davidberard98/408/base -> origin/gh/davidberard98/408/base 2025-09-07T07:38:46.3537854Z * [new branch] gh/davidberard98/408/head -> origin/gh/davidberard98/408/head 2025-09-07T07:38:46.3538387Z * [new branch] gh/davidberard98/408/orig -> origin/gh/davidberard98/408/orig 2025-09-07T07:38:46.3539159Z * [new branch] gh/davidberard98/409/base -> origin/gh/davidberard98/409/base 2025-09-07T07:38:46.3540016Z * [new branch] gh/davidberard98/409/head -> origin/gh/davidberard98/409/head 2025-09-07T07:38:46.3540390Z * [new branch] gh/davidberard98/409/orig -> origin/gh/davidberard98/409/orig 2025-09-07T07:38:46.3541271Z * [new branch] gh/desertfire/594/base -> origin/gh/desertfire/594/base 2025-09-07T07:38:46.3541651Z * [new branch] gh/desertfire/594/head -> origin/gh/desertfire/594/head 2025-09-07T07:38:46.3542228Z * [new branch] gh/desertfire/594/orig -> origin/gh/desertfire/594/orig 2025-09-07T07:38:46.3543005Z * [new branch] gh/desertfire/595/base -> origin/gh/desertfire/595/base 2025-09-07T07:38:46.3543433Z * [new branch] gh/desertfire/595/head -> origin/gh/desertfire/595/head 2025-09-07T07:38:46.3544009Z * [new branch] gh/desertfire/595/orig -> origin/gh/desertfire/595/orig 2025-09-07T07:38:46.3544819Z * [new branch] gh/desertfire/597/base -> origin/gh/desertfire/597/base 2025-09-07T07:38:46.3545257Z * [new branch] gh/desertfire/597/head -> origin/gh/desertfire/597/head 2025-09-07T07:38:46.3545766Z * [new branch] gh/desertfire/597/orig -> origin/gh/desertfire/597/orig 2025-09-07T07:38:46.3546887Z * [new branch] gh/dharakk/1/base -> origin/gh/dharakk/1/base 2025-09-07T07:38:46.3547372Z * [new branch] gh/dharakk/1/head -> origin/gh/dharakk/1/head 2025-09-07T07:38:46.3548331Z * [new branch] gh/drisspg/149/base -> origin/gh/drisspg/149/base 2025-09-07T07:38:46.3548769Z * [new branch] gh/drisspg/149/head -> origin/gh/drisspg/149/head 2025-09-07T07:38:46.3549275Z * [new branch] gh/drisspg/149/orig -> origin/gh/drisspg/149/orig 2025-09-07T07:38:46.3550080Z * [new branch] gh/drisspg/159/base -> origin/gh/drisspg/159/base 2025-09-07T07:38:46.3550563Z * [new branch] gh/drisspg/159/head -> origin/gh/drisspg/159/head 2025-09-07T07:38:46.3551079Z * [new branch] gh/drisspg/159/orig -> origin/gh/drisspg/159/orig 2025-09-07T07:38:46.3551891Z * [new branch] gh/drisspg/166/base -> origin/gh/drisspg/166/base 2025-09-07T07:38:46.3552299Z * [new branch] gh/drisspg/166/head -> origin/gh/drisspg/166/head 2025-09-07T07:38:46.3552807Z * [new branch] gh/drisspg/166/orig -> origin/gh/drisspg/166/orig 2025-09-07T07:38:46.3553562Z * [new branch] gh/drisspg/170/base -> origin/gh/drisspg/170/base 2025-09-07T07:38:46.3554004Z * [new branch] gh/drisspg/170/head -> origin/gh/drisspg/170/head 2025-09-07T07:38:46.3554527Z * [new branch] gh/drisspg/170/orig -> origin/gh/drisspg/170/orig 2025-09-07T07:38:46.3555430Z * [new branch] gh/drisspg/173/base -> origin/gh/drisspg/173/base 2025-09-07T07:38:46.3555841Z * [new branch] gh/drisspg/173/head -> origin/gh/drisspg/173/head 2025-09-07T07:38:46.3556348Z * [new branch] gh/drisspg/173/orig -> origin/gh/drisspg/173/orig 2025-09-07T07:38:46.3557157Z * [new branch] gh/drisspg/177/base -> origin/gh/drisspg/177/base 2025-09-07T07:38:46.3557659Z * [new branch] gh/drisspg/177/head -> 
origin/gh/drisspg/177/head 2025-09-07T07:38:46.3558209Z * [new branch] gh/drisspg/177/orig -> origin/gh/drisspg/177/orig 2025-09-07T07:38:46.3558964Z * [new branch] gh/drisspg/178/base -> origin/gh/drisspg/178/base 2025-09-07T07:38:46.3559406Z * [new branch] gh/drisspg/178/head -> origin/gh/drisspg/178/head 2025-09-07T07:38:46.3559816Z * [new branch] gh/drisspg/178/orig -> origin/gh/drisspg/178/orig 2025-09-07T07:38:46.3560603Z * [new branch] gh/drisspg/180/base -> origin/gh/drisspg/180/base 2025-09-07T07:38:46.3561034Z * [new branch] gh/drisspg/180/head -> origin/gh/drisspg/180/head 2025-09-07T07:38:46.3561538Z * [new branch] gh/drisspg/180/orig -> origin/gh/drisspg/180/orig 2025-09-07T07:38:46.3562289Z * [new branch] gh/drisspg/181/base -> origin/gh/drisspg/181/base 2025-09-07T07:38:46.3562722Z * [new branch] gh/drisspg/181/head -> origin/gh/drisspg/181/head 2025-09-07T07:38:46.3563209Z * [new branch] gh/drisspg/181/orig -> origin/gh/drisspg/181/orig 2025-09-07T07:38:46.3564068Z * [new branch] gh/drisspg/182/base -> origin/gh/drisspg/182/base 2025-09-07T07:38:46.3564636Z * [new branch] gh/drisspg/182/head -> origin/gh/drisspg/182/head 2025-09-07T07:38:46.3565267Z * [new branch] gh/drisspg/183/base -> origin/gh/drisspg/183/base 2025-09-07T07:38:46.3565716Z * [new branch] gh/drisspg/183/head -> origin/gh/drisspg/183/head 2025-09-07T07:38:46.3566464Z * [new branch] gh/drisspg/184/base -> origin/gh/drisspg/184/base 2025-09-07T07:38:46.3566855Z * [new branch] gh/drisspg/184/head -> origin/gh/drisspg/184/head 2025-09-07T07:38:46.3568309Z * [new branch] gh/drisspg/185/base -> origin/gh/drisspg/185/base 2025-09-07T07:38:46.3568688Z * [new branch] gh/drisspg/185/head -> origin/gh/drisspg/185/head 2025-09-07T07:38:46.3569006Z * [new branch] gh/drisspg/186/base -> origin/gh/drisspg/186/base 2025-09-07T07:38:46.3569313Z * [new branch] gh/drisspg/186/head -> origin/gh/drisspg/186/head 2025-09-07T07:38:46.3569747Z * [new branch] gh/drisspg/186/orig -> origin/gh/drisspg/186/orig 2025-09-07T07:38:46.3570541Z * [new branch] gh/drisspg/187/base -> origin/gh/drisspg/187/base 2025-09-07T07:38:46.3571041Z * [new branch] gh/drisspg/187/head -> origin/gh/drisspg/187/head 2025-09-07T07:38:46.3571549Z * [new branch] gh/drisspg/187/orig -> origin/gh/drisspg/187/orig 2025-09-07T07:38:46.3572344Z * [new branch] gh/drisspg/188/base -> origin/gh/drisspg/188/base 2025-09-07T07:38:46.3573071Z * [new branch] gh/drisspg/188/head -> origin/gh/drisspg/188/head 2025-09-07T07:38:46.3573449Z * [new branch] gh/drisspg/188/orig -> origin/gh/drisspg/188/orig 2025-09-07T07:38:46.3574508Z * [new branch] gh/drisspg/189/base -> origin/gh/drisspg/189/base 2025-09-07T07:38:46.3574913Z * [new branch] gh/drisspg/189/head -> origin/gh/drisspg/189/head 2025-09-07T07:38:46.3575466Z * [new branch] gh/drisspg/189/orig -> origin/gh/drisspg/189/orig 2025-09-07T07:38:46.3576288Z * [new branch] gh/drisspg/190/base -> origin/gh/drisspg/190/base 2025-09-07T07:38:46.3576844Z * [new branch] gh/drisspg/190/head -> origin/gh/drisspg/190/head 2025-09-07T07:38:46.3577223Z * [new branch] gh/drisspg/190/orig -> origin/gh/drisspg/190/orig 2025-09-07T07:38:46.3578015Z * [new branch] gh/drisspg/191/base -> origin/gh/drisspg/191/base 2025-09-07T07:38:46.3578516Z * [new branch] gh/drisspg/191/head -> origin/gh/drisspg/191/head 2025-09-07T07:38:46.3579036Z * [new branch] gh/drisspg/191/orig -> origin/gh/drisspg/191/orig 2025-09-07T07:38:46.3579812Z * [new branch] gh/drisspg/192/base -> origin/gh/drisspg/192/base 2025-09-07T07:38:46.3580228Z * [new branch] 
gh/drisspg/192/head -> origin/gh/drisspg/192/head 2025-09-07T07:38:46.3580733Z * [new branch] gh/drisspg/192/orig -> origin/gh/drisspg/192/orig 2025-09-07T07:38:46.3584911Z * [new branch] gh/drisspg/193/base -> origin/gh/drisspg/193/base 2025-09-07T07:38:46.3585725Z * [new branch] gh/drisspg/193/head -> origin/gh/drisspg/193/head 2025-09-07T07:38:46.3586162Z * [new branch] gh/drisspg/193/orig -> origin/gh/drisspg/193/orig 2025-09-07T07:38:46.3586813Z * [new branch] gh/drisspg/194/base -> origin/gh/drisspg/194/base 2025-09-07T07:38:46.3587315Z * [new branch] gh/drisspg/194/head -> origin/gh/drisspg/194/head 2025-09-07T07:38:46.3587833Z * [new branch] gh/drisspg/194/orig -> origin/gh/drisspg/194/orig 2025-09-07T07:38:46.3588637Z * [new branch] gh/drisspg/195/base -> origin/gh/drisspg/195/base 2025-09-07T07:38:46.3589114Z * [new branch] gh/drisspg/195/head -> origin/gh/drisspg/195/head 2025-09-07T07:38:46.3589622Z * [new branch] gh/drisspg/195/orig -> origin/gh/drisspg/195/orig 2025-09-07T07:38:46.3590421Z * [new branch] gh/drisspg/196/base -> origin/gh/drisspg/196/base 2025-09-07T07:38:46.3590811Z * [new branch] gh/drisspg/196/head -> origin/gh/drisspg/196/head 2025-09-07T07:38:46.3591316Z * [new branch] gh/drisspg/196/orig -> origin/gh/drisspg/196/orig 2025-09-07T07:38:46.3592128Z * [new branch] gh/drisspg/197/base -> origin/gh/drisspg/197/base 2025-09-07T07:38:46.3592487Z * [new branch] gh/drisspg/197/head -> origin/gh/drisspg/197/head 2025-09-07T07:38:46.3593022Z * [new branch] gh/drisspg/197/orig -> origin/gh/drisspg/197/orig 2025-09-07T07:38:46.3593823Z * [new branch] gh/drisspg/198/base -> origin/gh/drisspg/198/base 2025-09-07T07:38:46.3594365Z * [new branch] gh/drisspg/198/head -> origin/gh/drisspg/198/head 2025-09-07T07:38:46.3594887Z * [new branch] gh/drisspg/198/orig -> origin/gh/drisspg/198/orig 2025-09-07T07:38:46.3595659Z * [new branch] gh/drisspg/199/base -> origin/gh/drisspg/199/base 2025-09-07T07:38:46.3596145Z * [new branch] gh/drisspg/199/head -> origin/gh/drisspg/199/head 2025-09-07T07:38:46.3596630Z * [new branch] gh/drisspg/199/orig -> origin/gh/drisspg/199/orig 2025-09-07T07:38:46.3597824Z * [new branch] gh/dsjohns2/1/base -> origin/gh/dsjohns2/1/base 2025-09-07T07:38:46.3598296Z * [new branch] gh/dsjohns2/1/head -> origin/gh/dsjohns2/1/head 2025-09-07T07:38:46.3599150Z * [new branch] gh/eellison/784/base -> origin/gh/eellison/784/base 2025-09-07T07:38:46.3599522Z * [new branch] gh/eellison/784/head -> origin/gh/eellison/784/head 2025-09-07T07:38:46.3600047Z * [new branch] gh/eellison/784/orig -> origin/gh/eellison/784/orig 2025-09-07T07:38:46.3601042Z * [new branch] gh/eellison/785/base -> origin/gh/eellison/785/base 2025-09-07T07:38:46.3601450Z * [new branch] gh/eellison/785/head -> origin/gh/eellison/785/head 2025-09-07T07:38:46.3601981Z * [new branch] gh/eellison/785/orig -> origin/gh/eellison/785/orig 2025-09-07T07:38:46.3602842Z * [new branch] gh/eellison/789/base -> origin/gh/eellison/789/base 2025-09-07T07:38:46.3603250Z * [new branch] gh/eellison/789/head -> origin/gh/eellison/789/head 2025-09-07T07:38:46.3604020Z * [new branch] gh/eellison/789/orig -> origin/gh/eellison/789/orig 2025-09-07T07:38:46.3604618Z * [new branch] gh/eellison/800/base -> origin/gh/eellison/800/base 2025-09-07T07:38:46.3605113Z * [new branch] gh/eellison/800/head -> origin/gh/eellison/800/head 2025-09-07T07:38:46.3605651Z * [new branch] gh/eellison/800/orig -> origin/gh/eellison/800/orig 2025-09-07T07:38:46.3606453Z * [new branch] gh/eellison/801/base -> origin/gh/eellison/801/base 
2025-09-07T07:38:46.3606811Z * [new branch] gh/eellison/801/head -> origin/gh/eellison/801/head 2025-09-07T07:38:46.3607332Z * [new branch] gh/eellison/801/orig -> origin/gh/eellison/801/orig 2025-09-07T07:38:46.3608132Z * [new branch] gh/eellison/802/base -> origin/gh/eellison/802/base 2025-09-07T07:38:46.3608543Z * [new branch] gh/eellison/802/head -> origin/gh/eellison/802/head 2025-09-07T07:38:46.3609080Z * [new branch] gh/eellison/802/orig -> origin/gh/eellison/802/orig 2025-09-07T07:38:46.3609918Z * [new branch] gh/eellison/805/base -> origin/gh/eellison/805/base 2025-09-07T07:38:46.3610317Z * [new branch] gh/eellison/805/head -> origin/gh/eellison/805/head 2025-09-07T07:38:46.3610861Z * [new branch] gh/eellison/805/orig -> origin/gh/eellison/805/orig 2025-09-07T07:38:46.3611707Z * [new branch] gh/eellison/808/base -> origin/gh/eellison/808/base 2025-09-07T07:38:46.3612285Z * [new branch] gh/eellison/808/head -> origin/gh/eellison/808/head 2025-09-07T07:38:46.3612806Z * [new branch] gh/eellison/808/orig -> origin/gh/eellison/808/orig 2025-09-07T07:38:46.3613608Z * [new branch] gh/eellison/809/base -> origin/gh/eellison/809/base 2025-09-07T07:38:46.3613986Z * [new branch] gh/eellison/809/head -> origin/gh/eellison/809/head 2025-09-07T07:38:46.3614499Z * [new branch] gh/eellison/809/orig -> origin/gh/eellison/809/orig 2025-09-07T07:38:46.3615322Z * [new branch] gh/eellison/813/base -> origin/gh/eellison/813/base 2025-09-07T07:38:46.3615683Z * [new branch] gh/eellison/813/head -> origin/gh/eellison/813/head 2025-09-07T07:38:46.3616193Z * [new branch] gh/eellison/813/orig -> origin/gh/eellison/813/orig 2025-09-07T07:38:46.3617128Z * [new branch] gh/eellison/814/base -> origin/gh/eellison/814/base 2025-09-07T07:38:46.3617499Z * [new branch] gh/eellison/814/head -> origin/gh/eellison/814/head 2025-09-07T07:38:46.3618019Z * [new branch] gh/eellison/814/orig -> origin/gh/eellison/814/orig 2025-09-07T07:38:46.3619246Z * [new branch] gh/eellison/815/base -> origin/gh/eellison/815/base 2025-09-07T07:38:46.3619665Z * [new branch] gh/eellison/815/head -> origin/gh/eellison/815/head 2025-09-07T07:38:46.3620166Z * [new branch] gh/eellison/815/orig -> origin/gh/eellison/815/orig 2025-09-07T07:38:46.3621018Z * [new branch] gh/eellison/816/base -> origin/gh/eellison/816/base 2025-09-07T07:38:46.3621491Z * [new branch] gh/eellison/816/head -> origin/gh/eellison/816/head 2025-09-07T07:38:46.3622017Z * [new branch] gh/eellison/816/orig -> origin/gh/eellison/816/orig 2025-09-07T07:38:46.3622774Z * [new branch] gh/eellison/817/base -> origin/gh/eellison/817/base 2025-09-07T07:38:46.3623203Z * [new branch] gh/eellison/817/head -> origin/gh/eellison/817/head 2025-09-07T07:38:46.3623706Z * [new branch] gh/eellison/817/orig -> origin/gh/eellison/817/orig 2025-09-07T07:38:46.3624595Z * [new branch] gh/eellison/818/base -> origin/gh/eellison/818/base 2025-09-07T07:38:46.3625044Z * [new branch] gh/eellison/818/head -> origin/gh/eellison/818/head 2025-09-07T07:38:46.3625547Z * [new branch] gh/eellison/818/orig -> origin/gh/eellison/818/orig 2025-09-07T07:38:46.3626507Z * [new branch] gh/eellison/819/base -> origin/gh/eellison/819/base 2025-09-07T07:38:46.3626887Z * [new branch] gh/eellison/819/head -> origin/gh/eellison/819/head 2025-09-07T07:38:46.3627420Z * [new branch] gh/eellison/819/orig -> origin/gh/eellison/819/orig 2025-09-07T07:38:46.3628619Z * [new branch] gh/eellison/820/base -> origin/gh/eellison/820/base 2025-09-07T07:38:46.3629125Z * [new branch] gh/eellison/820/head -> origin/gh/eellison/820/head 
2025-09-07T07:38:46.3629640Z * [new branch] gh/eellison/820/orig -> origin/gh/eellison/820/orig 2025-09-07T07:38:46.3630392Z * [new branch] gh/eellison/821/base -> origin/gh/eellison/821/base 2025-09-07T07:38:46.3630970Z * [new branch] gh/eellison/821/head -> origin/gh/eellison/821/head 2025-09-07T07:38:46.3631720Z * [new branch] gh/eellison/821/orig -> origin/gh/eellison/821/orig 2025-09-07T07:38:46.3632322Z * [new branch] gh/eellison/822/base -> origin/gh/eellison/822/base 2025-09-07T07:38:46.3632821Z * [new branch] gh/eellison/822/head -> origin/gh/eellison/822/head 2025-09-07T07:38:46.3633325Z * [new branch] gh/eellison/822/orig -> origin/gh/eellison/822/orig 2025-09-07T07:38:46.3634130Z * [new branch] gh/eellison/823/base -> origin/gh/eellison/823/base 2025-09-07T07:38:46.3634553Z * [new branch] gh/eellison/823/head -> origin/gh/eellison/823/head 2025-09-07T07:38:46.3635130Z * [new branch] gh/eellison/823/orig -> origin/gh/eellison/823/orig 2025-09-07T07:38:46.3636040Z * [new branch] gh/etaf/132/base -> origin/gh/etaf/132/base 2025-09-07T07:38:46.3636476Z * [new branch] gh/etaf/132/head -> origin/gh/etaf/132/head 2025-09-07T07:38:46.3637114Z * [new branch] gh/etaf/132/orig -> origin/gh/etaf/132/orig 2025-09-07T07:38:46.3637807Z * [new branch] gh/etaf/138/base -> origin/gh/etaf/138/base 2025-09-07T07:38:46.3638238Z * [new branch] gh/etaf/138/head -> origin/gh/etaf/138/head 2025-09-07T07:38:46.3638877Z * [new branch] gh/etaf/138/orig -> origin/gh/etaf/138/orig 2025-09-07T07:38:46.3639654Z * [new branch] gh/etaf/140/base -> origin/gh/etaf/140/base 2025-09-07T07:38:46.3640280Z * [new branch] gh/etaf/140/head -> origin/gh/etaf/140/head 2025-09-07T07:38:46.3640794Z * [new branch] gh/etaf/140/orig -> origin/gh/etaf/140/orig 2025-09-07T07:38:46.3641565Z * [new branch] gh/etaf/143/base -> origin/gh/etaf/143/base 2025-09-07T07:38:46.3641968Z * [new branch] gh/etaf/143/head -> origin/gh/etaf/143/head 2025-09-07T07:38:46.3642568Z * [new branch] gh/etaf/143/orig -> origin/gh/etaf/143/orig 2025-09-07T07:38:46.3643313Z * [new branch] gh/etaf/147/base -> origin/gh/etaf/147/base 2025-09-07T07:38:46.3643754Z * [new branch] gh/etaf/147/head -> origin/gh/etaf/147/head 2025-09-07T07:38:46.3644600Z * [new branch] gh/etaf/151/base -> origin/gh/etaf/151/base 2025-09-07T07:38:46.3645240Z * [new branch] gh/etaf/151/head -> origin/gh/etaf/151/head 2025-09-07T07:38:46.3645673Z * [new branch] gh/etaf/151/orig -> origin/gh/etaf/151/orig 2025-09-07T07:38:46.3646638Z * [new branch] gh/etaf/152/base -> origin/gh/etaf/152/base 2025-09-07T07:38:46.3647203Z * [new branch] gh/etaf/152/head -> origin/gh/etaf/152/head 2025-09-07T07:38:46.3647650Z * [new branch] gh/etaf/152/orig -> origin/gh/etaf/152/orig 2025-09-07T07:38:46.3648473Z * [new branch] gh/etaf/153/base -> origin/gh/etaf/153/base 2025-09-07T07:38:46.3648900Z * [new branch] gh/etaf/153/head -> origin/gh/etaf/153/head 2025-09-07T07:38:46.3649611Z * [new branch] gh/etaf/153/orig -> origin/gh/etaf/153/orig 2025-09-07T07:38:46.3650492Z * [new branch] gh/etaf/154/base -> origin/gh/etaf/154/base 2025-09-07T07:38:46.3651057Z * [new branch] gh/etaf/154/head -> origin/gh/etaf/154/head 2025-09-07T07:38:46.3651438Z * [new branch] gh/etaf/154/orig -> origin/gh/etaf/154/orig 2025-09-07T07:38:46.3652301Z * [new branch] gh/etaf/155/base -> origin/gh/etaf/155/base 2025-09-07T07:38:46.3652871Z * [new branch] gh/etaf/155/head -> origin/gh/etaf/155/head 2025-09-07T07:38:46.3653290Z * [new branch] gh/etaf/155/orig -> origin/gh/etaf/155/orig 2025-09-07T07:38:46.3654100Z * [new 
branch] gh/etaf/156/base -> origin/gh/etaf/156/base 2025-09-07T07:38:46.3654542Z * [new branch] gh/etaf/156/head -> origin/gh/etaf/156/head 2025-09-07T07:38:46.3655109Z * [new branch] gh/etaf/156/orig -> origin/gh/etaf/156/orig 2025-09-07T07:38:46.3655962Z * [new branch] gh/etaf/157/base -> origin/gh/etaf/157/base 2025-09-07T07:38:46.3656544Z * [new branch] gh/etaf/157/head -> origin/gh/etaf/157/head 2025-09-07T07:38:46.3656986Z * [new branch] gh/etaf/157/orig -> origin/gh/etaf/157/orig 2025-09-07T07:38:46.3657753Z * [new branch] gh/etaf/158/base -> origin/gh/etaf/158/base 2025-09-07T07:38:46.3658427Z * [new branch] gh/etaf/158/head -> origin/gh/etaf/158/head 2025-09-07T07:38:46.3658863Z * [new branch] gh/etaf/158/orig -> origin/gh/etaf/158/orig 2025-09-07T07:38:46.3659652Z * [new branch] gh/etaf/159/base -> origin/gh/etaf/159/base 2025-09-07T07:38:46.3660224Z * [new branch] gh/etaf/159/head -> origin/gh/etaf/159/head 2025-09-07T07:38:46.3660652Z * [new branch] gh/etaf/159/orig -> origin/gh/etaf/159/orig 2025-09-07T07:38:46.3661582Z * [new branch] gh/etaf/160/base -> origin/gh/etaf/160/base 2025-09-07T07:38:46.3662149Z * [new branch] gh/etaf/160/head -> origin/gh/etaf/160/head 2025-09-07T07:38:46.3662582Z * [new branch] gh/etaf/160/orig -> origin/gh/etaf/160/orig 2025-09-07T07:38:46.3663397Z * [new branch] gh/etaf/161/base -> origin/gh/etaf/161/base 2025-09-07T07:38:46.3663985Z * [new branch] gh/etaf/161/head -> origin/gh/etaf/161/head 2025-09-07T07:38:46.3664399Z * [new branch] gh/etaf/161/orig -> origin/gh/etaf/161/orig 2025-09-07T07:38:46.3665413Z * [new branch] gh/etaf/162/base -> origin/gh/etaf/162/base 2025-09-07T07:38:46.3665855Z * [new branch] gh/etaf/162/head -> origin/gh/etaf/162/head 2025-09-07T07:38:46.3666425Z * [new branch] gh/etaf/162/orig -> origin/gh/etaf/162/orig 2025-09-07T07:38:46.3667227Z * [new branch] gh/etaf/163/base -> origin/gh/etaf/163/base 2025-09-07T07:38:46.3667868Z * [new branch] gh/etaf/163/head -> origin/gh/etaf/163/head 2025-09-07T07:38:46.3668265Z * [new branch] gh/etaf/163/orig -> origin/gh/etaf/163/orig 2025-09-07T07:38:46.3669195Z * [new branch] gh/etaf/164/base -> origin/gh/etaf/164/base 2025-09-07T07:38:46.3669837Z * [new branch] gh/etaf/164/head -> origin/gh/etaf/164/head 2025-09-07T07:38:46.3670156Z * [new branch] gh/etaf/164/orig -> origin/gh/etaf/164/orig 2025-09-07T07:38:46.3671004Z * [new branch] gh/etaf/165/base -> origin/gh/etaf/165/base 2025-09-07T07:38:46.3671424Z * [new branch] gh/etaf/165/orig -> origin/gh/etaf/165/orig 2025-09-07T07:38:46.3672225Z * [new branch] gh/etaf/166/base -> origin/gh/etaf/166/base 2025-09-07T07:38:46.3672785Z * [new branch] gh/etaf/166/head -> origin/gh/etaf/166/head 2025-09-07T07:38:46.3673211Z * [new branch] gh/etaf/166/orig -> origin/gh/etaf/166/orig 2025-09-07T07:38:46.3674219Z * [new branch] gh/etaf/167/base -> origin/gh/etaf/167/base 2025-09-07T07:38:46.3674640Z * [new branch] gh/etaf/167/head -> origin/gh/etaf/167/head 2025-09-07T07:38:46.3675220Z * [new branch] gh/etaf/167/orig -> origin/gh/etaf/167/orig 2025-09-07T07:38:46.3675947Z * [new branch] gh/etaf/168/base -> origin/gh/etaf/168/base 2025-09-07T07:38:46.3676625Z * [new branch] gh/etaf/168/head -> origin/gh/etaf/168/head 2025-09-07T07:38:46.3677250Z * [new branch] gh/etaf/168/orig -> origin/gh/etaf/168/orig 2025-09-07T07:38:46.3678091Z * [new branch] gh/etaf/169/base -> origin/gh/etaf/169/base 2025-09-07T07:38:46.3678521Z * [new branch] gh/etaf/169/head -> origin/gh/etaf/169/head 2025-09-07T07:38:46.3679122Z * [new branch] gh/etaf/169/orig -> 
origin/gh/etaf/169/orig 2025-09-07T07:38:46.3680021Z * [new branch] gh/exclamaforte/1/base -> origin/gh/exclamaforte/1/base 2025-09-07T07:38:46.3680434Z * [new branch] gh/exclamaforte/1/head -> origin/gh/exclamaforte/1/head 2025-09-07T07:38:46.3681169Z * [new branch] gh/exclamaforte/2/base -> origin/gh/exclamaforte/2/base 2025-09-07T07:38:46.3681598Z * [new branch] gh/exclamaforte/2/head -> origin/gh/exclamaforte/2/head 2025-09-07T07:38:46.3682424Z * [new branch] gh/exclamaforte/3/base -> origin/gh/exclamaforte/3/base 2025-09-07T07:38:46.3682827Z * [new branch] gh/exclamaforte/3/head -> origin/gh/exclamaforte/3/head 2025-09-07T07:38:46.3683684Z * [new branch] gh/exclamaforte/4/base -> origin/gh/exclamaforte/4/base 2025-09-07T07:38:46.3684073Z * [new branch] gh/exclamaforte/4/head -> origin/gh/exclamaforte/4/head 2025-09-07T07:38:46.3685102Z * [new branch] gh/ezyang/2374/base -> origin/gh/ezyang/2374/base 2025-09-07T07:38:46.3685576Z * [new branch] gh/ezyang/2374/head -> origin/gh/ezyang/2374/head 2025-09-07T07:38:46.3686278Z * [new branch] gh/ezyang/2374/orig -> origin/gh/ezyang/2374/orig 2025-09-07T07:38:46.3686972Z * [new branch] gh/ezyang/2973/base -> origin/gh/ezyang/2973/base 2025-09-07T07:38:46.3687422Z * [new branch] gh/ezyang/2973/head -> origin/gh/ezyang/2973/head 2025-09-07T07:38:46.3688065Z * [new branch] gh/ezyang/2973/orig -> origin/gh/ezyang/2973/orig 2025-09-07T07:38:46.3688731Z * [new branch] gh/ezyang/2974/base -> origin/gh/ezyang/2974/base 2025-09-07T07:38:46.3689122Z * [new branch] gh/ezyang/2974/head -> origin/gh/ezyang/2974/head 2025-09-07T07:38:46.3689760Z * [new branch] gh/ezyang/2974/orig -> origin/gh/ezyang/2974/orig 2025-09-07T07:38:46.3690481Z * [new branch] gh/ezyang/3074/base -> origin/gh/ezyang/3074/base 2025-09-07T07:38:46.3690887Z * [new branch] gh/ezyang/3074/head -> origin/gh/ezyang/3074/head 2025-09-07T07:38:46.3691553Z * [new branch] gh/ezyang/3074/orig -> origin/gh/ezyang/3074/orig 2025-09-07T07:38:46.3692189Z * [new branch] gh/ezyang/3088/base -> origin/gh/ezyang/3088/base 2025-09-07T07:38:46.3692607Z * [new branch] gh/ezyang/3088/head -> origin/gh/ezyang/3088/head 2025-09-07T07:38:46.3693206Z * [new branch] gh/ezyang/3088/orig -> origin/gh/ezyang/3088/orig 2025-09-07T07:38:46.3693918Z * [new branch] gh/ezyang/3092/base -> origin/gh/ezyang/3092/base 2025-09-07T07:38:46.3694494Z * [new branch] gh/ezyang/3092/head -> origin/gh/ezyang/3092/head 2025-09-07T07:38:46.3694951Z * [new branch] gh/ezyang/3092/orig -> origin/gh/ezyang/3092/orig 2025-09-07T07:38:46.3695854Z * [new branch] gh/ezyang/3103/base -> origin/gh/ezyang/3103/base 2025-09-07T07:38:46.3696265Z * [new branch] gh/ezyang/3103/head -> origin/gh/ezyang/3103/head 2025-09-07T07:38:46.3696907Z * [new branch] gh/ezyang/3103/orig -> origin/gh/ezyang/3103/orig 2025-09-07T07:38:46.3697697Z * [new branch] gh/ezyang/3105/base -> origin/gh/ezyang/3105/base 2025-09-07T07:38:46.3698021Z * [new branch] gh/ezyang/3105/head -> origin/gh/ezyang/3105/head 2025-09-07T07:38:46.3698675Z * [new branch] gh/ezyang/3105/orig -> origin/gh/ezyang/3105/orig 2025-09-07T07:38:46.3699359Z * [new branch] gh/ezyang/3114/base -> origin/gh/ezyang/3114/base 2025-09-07T07:38:46.3699824Z * [new branch] gh/ezyang/3114/head -> origin/gh/ezyang/3114/head 2025-09-07T07:38:46.3700403Z * [new branch] gh/ezyang/3114/orig -> origin/gh/ezyang/3114/orig 2025-09-07T07:38:46.3701089Z * [new branch] gh/ezyang/3116/base -> origin/gh/ezyang/3116/base 2025-09-07T07:38:46.3701529Z * [new branch] gh/ezyang/3116/head -> origin/gh/ezyang/3116/head 
2025-09-07T07:38:46.3702116Z * [new branch] gh/ezyang/3116/orig -> origin/gh/ezyang/3116/orig 2025-09-07T07:38:46.3702783Z * [new branch] gh/ezyang/3120/base -> origin/gh/ezyang/3120/base 2025-09-07T07:38:46.3703218Z * [new branch] gh/ezyang/3120/head -> origin/gh/ezyang/3120/head 2025-09-07T07:38:46.3703794Z * [new branch] gh/ezyang/3120/orig -> origin/gh/ezyang/3120/orig 2025-09-07T07:38:46.3704764Z * [new branch] gh/ezyang/3122/base -> origin/gh/ezyang/3122/base 2025-09-07T07:38:46.3705177Z * [new branch] gh/ezyang/3122/head -> origin/gh/ezyang/3122/head 2025-09-07T07:38:46.3705815Z * [new branch] gh/ezyang/3122/orig -> origin/gh/ezyang/3122/orig 2025-09-07T07:38:46.3706451Z * [new branch] gh/ezyang/3123/base -> origin/gh/ezyang/3123/base 2025-09-07T07:38:46.3706871Z * [new branch] gh/ezyang/3123/head -> origin/gh/ezyang/3123/head 2025-09-07T07:38:46.3707549Z * [new branch] gh/ezyang/3123/orig -> origin/gh/ezyang/3123/orig 2025-09-07T07:38:46.3708167Z * [new branch] gh/ezyang/3125/base -> origin/gh/ezyang/3125/base 2025-09-07T07:38:46.3708579Z * [new branch] gh/ezyang/3125/head -> origin/gh/ezyang/3125/head 2025-09-07T07:38:46.3709160Z * [new branch] gh/ezyang/3125/orig -> origin/gh/ezyang/3125/orig 2025-09-07T07:38:46.3709845Z * [new branch] gh/ezyang/3126/base -> origin/gh/ezyang/3126/base 2025-09-07T07:38:46.3710249Z * [new branch] gh/ezyang/3126/head -> origin/gh/ezyang/3126/head 2025-09-07T07:38:46.3710835Z * [new branch] gh/ezyang/3126/orig -> origin/gh/ezyang/3126/orig 2025-09-07T07:38:46.3711835Z * [new branch] gh/ezyang/3127/base -> origin/gh/ezyang/3127/base 2025-09-07T07:38:46.3712302Z * [new branch] gh/ezyang/3127/head -> origin/gh/ezyang/3127/head 2025-09-07T07:38:46.3712995Z * [new branch] gh/ezyang/3127/orig -> origin/gh/ezyang/3127/orig 2025-09-07T07:38:46.3713755Z * [new branch] gh/ezyang/3128/base -> origin/gh/ezyang/3128/base 2025-09-07T07:38:46.3714207Z * [new branch] gh/ezyang/3128/head -> origin/gh/ezyang/3128/head 2025-09-07T07:38:46.3746652Z * [new branch] gh/ezyang/3128/orig -> origin/gh/ezyang/3128/orig 2025-09-07T07:38:46.3746917Z * [new branch] gh/ezyang/3129/base -> origin/gh/ezyang/3129/base 2025-09-07T07:38:46.3747067Z * [new branch] gh/ezyang/3129/head -> origin/gh/ezyang/3129/head 2025-09-07T07:38:46.3747198Z * [new branch] gh/ezyang/3129/orig -> origin/gh/ezyang/3129/orig 2025-09-07T07:38:46.3747416Z * [new branch] gh/ezyang/3130/base -> origin/gh/ezyang/3130/base 2025-09-07T07:38:46.3747555Z * [new branch] gh/ezyang/3130/head -> origin/gh/ezyang/3130/head 2025-09-07T07:38:46.3747675Z * [new branch] gh/ezyang/3130/orig -> origin/gh/ezyang/3130/orig 2025-09-07T07:38:46.3747806Z * [new branch] gh/ezyang/3131/base -> origin/gh/ezyang/3131/base 2025-09-07T07:38:46.3747982Z * [new branch] gh/ezyang/3131/head -> origin/gh/ezyang/3131/head 2025-09-07T07:38:46.3748116Z * [new branch] gh/ezyang/3131/orig -> origin/gh/ezyang/3131/orig 2025-09-07T07:38:46.3748239Z * [new branch] gh/ezyang/3132/base -> origin/gh/ezyang/3132/base 2025-09-07T07:38:46.3748357Z * [new branch] gh/ezyang/3132/head -> origin/gh/ezyang/3132/head 2025-09-07T07:38:46.3748483Z * [new branch] gh/ezyang/3132/orig -> origin/gh/ezyang/3132/orig 2025-09-07T07:38:46.3748604Z * [new branch] gh/ezyang/3133/base -> origin/gh/ezyang/3133/base 2025-09-07T07:38:46.3748731Z * [new branch] gh/ezyang/3133/head -> origin/gh/ezyang/3133/head 2025-09-07T07:38:46.3748852Z * [new branch] gh/ezyang/3133/orig -> origin/gh/ezyang/3133/orig 2025-09-07T07:38:46.3748977Z * [new branch] gh/ezyang/3134/base -> 
origin/gh/ezyang/3134/base 2025-09-07T07:38:46.3749095Z * [new branch] gh/ezyang/3134/head -> origin/gh/ezyang/3134/head 2025-09-07T07:38:46.3749213Z * [new branch] gh/ezyang/3134/orig -> origin/gh/ezyang/3134/orig 2025-09-07T07:38:46.3749342Z * [new branch] gh/ezyang/3135/base -> origin/gh/ezyang/3135/base 2025-09-07T07:38:46.3749463Z * [new branch] gh/ezyang/3135/head -> origin/gh/ezyang/3135/head 2025-09-07T07:38:46.3749590Z * [new branch] gh/ezyang/3135/orig -> origin/gh/ezyang/3135/orig 2025-09-07T07:38:46.3749710Z * [new branch] gh/ezyang/3136/base -> origin/gh/ezyang/3136/base 2025-09-07T07:38:46.3749863Z * [new branch] gh/ezyang/3136/head -> origin/gh/ezyang/3136/head 2025-09-07T07:38:46.3749994Z * [new branch] gh/ezyang/3136/orig -> origin/gh/ezyang/3136/orig 2025-09-07T07:38:46.3750115Z * [new branch] gh/ezyang/3137/base -> origin/gh/ezyang/3137/base 2025-09-07T07:38:46.3750239Z * [new branch] gh/ezyang/3137/head -> origin/gh/ezyang/3137/head 2025-09-07T07:38:46.3750355Z * [new branch] gh/ezyang/3137/orig -> origin/gh/ezyang/3137/orig 2025-09-07T07:38:46.3750481Z * [new branch] gh/ezyang/3138/base -> origin/gh/ezyang/3138/base 2025-09-07T07:38:46.3750599Z * [new branch] gh/ezyang/3138/head -> origin/gh/ezyang/3138/head 2025-09-07T07:38:46.3750717Z * [new branch] gh/ezyang/3138/orig -> origin/gh/ezyang/3138/orig 2025-09-07T07:38:46.3750852Z * [new branch] gh/ezyang/3139/base -> origin/gh/ezyang/3139/base 2025-09-07T07:38:46.3750975Z * [new branch] gh/ezyang/3139/head -> origin/gh/ezyang/3139/head 2025-09-07T07:38:46.3751105Z * [new branch] gh/ezyang/3139/orig -> origin/gh/ezyang/3139/orig 2025-09-07T07:38:46.3751219Z * [new branch] gh/ezyang/3140/base -> origin/gh/ezyang/3140/base 2025-09-07T07:38:46.3751347Z * [new branch] gh/ezyang/3140/head -> origin/gh/ezyang/3140/head 2025-09-07T07:38:46.3751467Z * [new branch] gh/ezyang/3140/orig -> origin/gh/ezyang/3140/orig 2025-09-07T07:38:46.3751584Z * [new branch] gh/ezyang/3141/base -> origin/gh/ezyang/3141/base 2025-09-07T07:38:46.3751711Z * [new branch] gh/ezyang/3141/head -> origin/gh/ezyang/3141/head 2025-09-07T07:38:46.3751827Z * [new branch] gh/ezyang/3141/orig -> origin/gh/ezyang/3141/orig 2025-09-07T07:38:46.3751998Z * [new branch] gh/ezyang/3142/base -> origin/gh/ezyang/3142/base 2025-09-07T07:38:46.3752120Z * [new branch] gh/ezyang/3142/head -> origin/gh/ezyang/3142/head 2025-09-07T07:38:46.3752239Z * [new branch] gh/ezyang/3142/orig -> origin/gh/ezyang/3142/orig 2025-09-07T07:38:46.3752364Z * [new branch] gh/ezyang/3143/base -> origin/gh/ezyang/3143/base 2025-09-07T07:38:46.3752480Z * [new branch] gh/ezyang/3143/head -> origin/gh/ezyang/3143/head 2025-09-07T07:38:46.3752602Z * [new branch] gh/ezyang/3143/orig -> origin/gh/ezyang/3143/orig 2025-09-07T07:38:46.3752730Z * [new branch] gh/fadara01/1/base -> origin/gh/fadara01/1/base 2025-09-07T07:38:46.3752855Z * [new branch] gh/fadara01/1/head -> origin/gh/fadara01/1/head 2025-09-07T07:38:46.3752974Z * [new branch] gh/fadara01/1/orig -> origin/gh/fadara01/1/orig 2025-09-07T07:38:46.3753107Z * [new branch] gh/fduwjj/171/base -> origin/gh/fduwjj/171/base 2025-09-07T07:38:46.3753229Z * [new branch] gh/fduwjj/171/head -> origin/gh/fduwjj/171/head 2025-09-07T07:38:46.3753346Z * [new branch] gh/fduwjj/171/orig -> origin/gh/fduwjj/171/orig 2025-09-07T07:38:46.3753469Z * [new branch] gh/fduwjj/175/base -> origin/gh/fduwjj/175/base 2025-09-07T07:38:46.3753596Z * [new branch] gh/fduwjj/175/head -> origin/gh/fduwjj/175/head 2025-09-07T07:38:46.3753717Z * [new branch] gh/fduwjj/175/orig -> 
origin/gh/fduwjj/175/orig 2025-09-07T07:38:46.3753832Z * [new branch] gh/fduwjj/176/base -> origin/gh/fduwjj/176/base 2025-09-07T07:38:46.3753959Z * [new branch] gh/fduwjj/176/head -> origin/gh/fduwjj/176/head 2025-09-07T07:38:46.3754079Z * [new branch] gh/fduwjj/176/orig -> origin/gh/fduwjj/176/orig 2025-09-07T07:38:46.3754192Z * [new branch] gh/fduwjj/177/base -> origin/gh/fduwjj/177/base 2025-09-07T07:38:46.3754360Z * [new branch] gh/fduwjj/177/head -> origin/gh/fduwjj/177/head 2025-09-07T07:38:46.3754473Z * [new branch] gh/fduwjj/177/orig -> origin/gh/fduwjj/177/orig 2025-09-07T07:38:46.3754599Z * [new branch] gh/fduwjj/178/base -> origin/gh/fduwjj/178/base 2025-09-07T07:38:46.3754715Z * [new branch] gh/fduwjj/178/head -> origin/gh/fduwjj/178/head 2025-09-07T07:38:46.3754858Z * [new branch] gh/fduwjj/178/orig -> origin/gh/fduwjj/178/orig 2025-09-07T07:38:46.3755657Z * [new branch] gh/fduwjj/179/base -> origin/gh/fduwjj/179/base 2025-09-07T07:38:46.3756032Z * [new branch] gh/fduwjj/179/head -> origin/gh/fduwjj/179/head 2025-09-07T07:38:46.3756844Z * [new branch] gh/fduwjj/179/orig -> origin/gh/fduwjj/179/orig 2025-09-07T07:38:46.3757386Z * [new branch] gh/fduwjj/180/base -> origin/gh/fduwjj/180/base 2025-09-07T07:38:46.3757816Z * [new branch] gh/fduwjj/180/head -> origin/gh/fduwjj/180/head 2025-09-07T07:38:46.3758446Z * [new branch] gh/fduwjj/180/orig -> origin/gh/fduwjj/180/orig 2025-09-07T07:38:46.3759115Z * [new branch] gh/fduwjj/181/base -> origin/gh/fduwjj/181/base 2025-09-07T07:38:46.3759923Z * [new branch] gh/fduwjj/181/head -> origin/gh/fduwjj/181/head 2025-09-07T07:38:46.3760351Z * [new branch] gh/fduwjj/181/orig -> origin/gh/fduwjj/181/orig 2025-09-07T07:38:46.3761131Z * [new branch] gh/fduwjj/182/base -> origin/gh/fduwjj/182/base 2025-09-07T07:38:46.3761580Z * [new branch] gh/fduwjj/182/head -> origin/gh/fduwjj/182/head 2025-09-07T07:38:46.3762066Z * [new branch] gh/fduwjj/182/orig -> origin/gh/fduwjj/182/orig 2025-09-07T07:38:46.3762956Z * [new branch] gh/fduwjj/183/base -> origin/gh/fduwjj/183/base 2025-09-07T07:38:46.3763738Z * [new branch] gh/fduwjj/183/head -> origin/gh/fduwjj/183/head 2025-09-07T07:38:46.3764179Z * [new branch] gh/fduwjj/183/orig -> origin/gh/fduwjj/183/orig 2025-09-07T07:38:46.3765240Z * [new branch] gh/fduwjj/184/base -> origin/gh/fduwjj/184/base 2025-09-07T07:38:46.3765657Z * [new branch] gh/fduwjj/184/head -> origin/gh/fduwjj/184/head 2025-09-07T07:38:46.3766221Z * [new branch] gh/fduwjj/184/orig -> origin/gh/fduwjj/184/orig 2025-09-07T07:38:46.3767023Z * [new branch] gh/fduwjj/185/base -> origin/gh/fduwjj/185/base 2025-09-07T07:38:46.3767471Z * [new branch] gh/fduwjj/185/head -> origin/gh/fduwjj/185/head 2025-09-07T07:38:46.3768111Z * [new branch] gh/fduwjj/185/orig -> origin/gh/fduwjj/185/orig 2025-09-07T07:38:46.3768760Z * [new branch] gh/fduwjj/186/base -> origin/gh/fduwjj/186/base 2025-09-07T07:38:46.3769333Z * [new branch] gh/fduwjj/186/head -> origin/gh/fduwjj/186/head 2025-09-07T07:38:46.3769777Z * [new branch] gh/fduwjj/186/orig -> origin/gh/fduwjj/186/orig 2025-09-07T07:38:46.3770502Z * [new branch] gh/fduwjj/187/base -> origin/gh/fduwjj/187/base 2025-09-07T07:38:46.3771396Z * [new branch] gh/fduwjj/187/head -> origin/gh/fduwjj/187/head 2025-09-07T07:38:46.3771988Z * [new branch] gh/fduwjj/187/orig -> origin/gh/fduwjj/187/orig 2025-09-07T07:38:46.3772591Z * [new branch] gh/fduwjj/188/base -> origin/gh/fduwjj/188/base 2025-09-07T07:38:46.3773009Z * [new branch] gh/fduwjj/188/head -> origin/gh/fduwjj/188/head 2025-09-07T07:38:46.3773456Z * 
[new branch] gh/fduwjj/188/orig -> origin/gh/fduwjj/188/orig 2025-09-07T07:38:46.3774249Z * [new branch] gh/fduwjj/189/base -> origin/gh/fduwjj/189/base 2025-09-07T07:38:46.3774564Z * [new branch] gh/fduwjj/189/head -> origin/gh/fduwjj/189/head 2025-09-07T07:38:46.3775002Z * [new branch] gh/fduwjj/189/orig -> origin/gh/fduwjj/189/orig 2025-09-07T07:38:46.3775946Z * [new branch] gh/fduwjj/190/base -> origin/gh/fduwjj/190/base 2025-09-07T07:38:46.3776413Z * [new branch] gh/fduwjj/190/head -> origin/gh/fduwjj/190/head 2025-09-07T07:38:46.3776976Z * [new branch] gh/fduwjj/190/orig -> origin/gh/fduwjj/190/orig 2025-09-07T07:38:46.3777640Z * [new branch] gh/fduwjj/191/base -> origin/gh/fduwjj/191/base 2025-09-07T07:38:46.3778457Z * [new branch] gh/fduwjj/191/head -> origin/gh/fduwjj/191/head 2025-09-07T07:38:46.3778916Z * [new branch] gh/fduwjj/191/orig -> origin/gh/fduwjj/191/orig 2025-09-07T07:38:46.3779897Z * [new branch] gh/fegin/306/base -> origin/gh/fegin/306/base 2025-09-07T07:38:46.3780305Z * [new branch] gh/fegin/306/head -> origin/gh/fegin/306/head 2025-09-07T07:38:46.3780991Z * [new branch] gh/fegin/306/orig -> origin/gh/fegin/306/orig 2025-09-07T07:38:46.3781674Z * [new branch] gh/fegin/307/base -> origin/gh/fegin/307/base 2025-09-07T07:38:46.3782095Z * [new branch] gh/fegin/307/head -> origin/gh/fegin/307/head 2025-09-07T07:38:46.3782706Z * [new branch] gh/fegin/307/orig -> origin/gh/fegin/307/orig 2025-09-07T07:38:46.3783385Z * [new branch] gh/fegin/308/base -> origin/gh/fegin/308/base 2025-09-07T07:38:46.3783816Z * [new branch] gh/fegin/308/head -> origin/gh/fegin/308/head 2025-09-07T07:38:46.3784376Z * [new branch] gh/fegin/308/orig -> origin/gh/fegin/308/orig 2025-09-07T07:38:46.3785255Z * [new branch] gh/fegin/309/base -> origin/gh/fegin/309/base 2025-09-07T07:38:46.3785675Z * [new branch] gh/fegin/309/head -> origin/gh/fegin/309/head 2025-09-07T07:38:46.3786324Z * [new branch] gh/fegin/309/orig -> origin/gh/fegin/309/orig 2025-09-07T07:38:46.3787011Z * [new branch] gh/fegin/310/base -> origin/gh/fegin/310/base 2025-09-07T07:38:46.3787666Z * [new branch] gh/fegin/310/head -> origin/gh/fegin/310/head 2025-09-07T07:38:46.3788284Z * [new branch] gh/fegin/310/orig -> origin/gh/fegin/310/orig 2025-09-07T07:38:46.3788976Z * [new branch] gh/fegin/311/base -> origin/gh/fegin/311/base 2025-09-07T07:38:46.3789428Z * [new branch] gh/fegin/311/head -> origin/gh/fegin/311/head 2025-09-07T07:38:46.3790049Z * [new branch] gh/fegin/311/orig -> origin/gh/fegin/311/orig 2025-09-07T07:38:46.3790688Z * [new branch] gh/fegin/312/base -> origin/gh/fegin/312/base 2025-09-07T07:38:46.3791330Z * [new branch] gh/fegin/312/head -> origin/gh/fegin/312/head 2025-09-07T07:38:46.3791601Z * [new branch] gh/fegin/312/orig -> origin/gh/fegin/312/orig 2025-09-07T07:38:46.3792495Z * [new branch] gh/fegin/313/base -> origin/gh/fegin/313/base 2025-09-07T07:38:46.3793329Z * [new branch] gh/fegin/313/head -> origin/gh/fegin/313/head 2025-09-07T07:38:46.3793773Z * [new branch] gh/fegin/313/orig -> origin/gh/fegin/313/orig 2025-09-07T07:38:46.3794778Z * [new branch] gh/fffrog/124/base -> origin/gh/fffrog/124/base 2025-09-07T07:38:46.3795234Z * [new branch] gh/fffrog/124/head -> origin/gh/fffrog/124/head 2025-09-07T07:38:46.3795866Z * [new branch] gh/fffrog/124/orig -> origin/gh/fffrog/124/orig 2025-09-07T07:38:46.3797221Z * [new branch] gh/fffrog/129/base -> origin/gh/fffrog/129/base 2025-09-07T07:38:46.3797851Z * [new branch] gh/fffrog/129/head -> origin/gh/fffrog/129/head 2025-09-07T07:38:46.3798295Z * [new branch] 
gh/fffrog/129/orig -> origin/gh/fffrog/129/orig 2025-09-07T07:38:46.3799082Z * [new branch] gh/fffrog/130/base -> origin/gh/fffrog/130/base 2025-09-07T07:38:46.3799532Z * [new branch] gh/fffrog/130/head -> origin/gh/fffrog/130/head 2025-09-07T07:38:46.3800152Z * [new branch] gh/fffrog/130/orig -> origin/gh/fffrog/130/orig 2025-09-07T07:38:46.3800985Z * [new branch] gh/fffrog/131/base -> origin/gh/fffrog/131/base 2025-09-07T07:38:46.3801379Z * [new branch] gh/fffrog/131/head -> origin/gh/fffrog/131/head 2025-09-07T07:38:46.3802051Z * [new branch] gh/fffrog/131/orig -> origin/gh/fffrog/131/orig 2025-09-07T07:38:46.3802957Z * [new branch] gh/fffrog/132/base -> origin/gh/fffrog/132/base 2025-09-07T07:38:46.3804498Z * [new branch] gh/fffrog/132/head -> origin/gh/fffrog/132/head 2025-09-07T07:38:46.3804620Z * [new branch] gh/fffrog/132/orig -> origin/gh/fffrog/132/orig 2025-09-07T07:38:46.3804740Z * [new branch] gh/fffrog/133/base -> origin/gh/fffrog/133/base 2025-09-07T07:38:46.3805194Z * [new branch] gh/fffrog/133/head -> origin/gh/fffrog/133/head 2025-09-07T07:38:46.3805753Z * [new branch] gh/fffrog/133/orig -> origin/gh/fffrog/133/orig 2025-09-07T07:38:46.3806563Z * [new branch] gh/fffrog/134/base -> origin/gh/fffrog/134/base 2025-09-07T07:38:46.3807015Z * [new branch] gh/fffrog/134/head -> origin/gh/fffrog/134/head 2025-09-07T07:38:46.3807620Z * [new branch] gh/fffrog/134/orig -> origin/gh/fffrog/134/orig 2025-09-07T07:38:46.3808405Z * [new branch] gh/fffrog/135/base -> origin/gh/fffrog/135/base 2025-09-07T07:38:46.3808776Z * [new branch] gh/fffrog/135/head -> origin/gh/fffrog/135/head 2025-09-07T07:38:46.3809390Z * [new branch] gh/fffrog/135/orig -> origin/gh/fffrog/135/orig 2025-09-07T07:38:46.3810121Z * [new branch] gh/fffrog/136/base -> origin/gh/fffrog/136/base 2025-09-07T07:38:46.3810521Z * [new branch] gh/fffrog/136/head -> origin/gh/fffrog/136/head 2025-09-07T07:38:46.3811123Z * [new branch] gh/fffrog/136/orig -> origin/gh/fffrog/136/orig 2025-09-07T07:38:46.3811756Z * [new branch] gh/fffrog/137/base -> origin/gh/fffrog/137/base 2025-09-07T07:38:46.3812151Z * [new branch] gh/fffrog/137/head -> origin/gh/fffrog/137/head 2025-09-07T07:38:46.3812768Z * [new branch] gh/fffrog/137/orig -> origin/gh/fffrog/137/orig 2025-09-07T07:38:46.3813434Z * [new branch] gh/fffrog/138/base -> origin/gh/fffrog/138/base 2025-09-07T07:38:46.3813873Z * [new branch] gh/fffrog/138/head -> origin/gh/fffrog/138/head 2025-09-07T07:38:46.3814475Z * [new branch] gh/fffrog/138/orig -> origin/gh/fffrog/138/orig 2025-09-07T07:38:46.3816661Z * [new branch] gh/fffrog/139/base -> origin/gh/fffrog/139/base 2025-09-07T07:38:46.3817834Z * [new branch] gh/fffrog/139/head -> origin/gh/fffrog/139/head 2025-09-07T07:38:46.3817968Z * [new branch] gh/fffrog/139/orig -> origin/gh/fffrog/139/orig 2025-09-07T07:38:46.3818088Z * [new branch] gh/fffrog/140/base -> origin/gh/fffrog/140/base 2025-09-07T07:38:46.3818217Z * [new branch] gh/fffrog/140/head -> origin/gh/fffrog/140/head 2025-09-07T07:38:46.3818332Z * [new branch] gh/fffrog/140/orig -> origin/gh/fffrog/140/orig 2025-09-07T07:38:46.3818686Z * [new branch] gh/fffrog/141/base -> origin/gh/fffrog/141/base 2025-09-07T07:38:46.3819141Z * [new branch] gh/fffrog/141/head -> origin/gh/fffrog/141/head 2025-09-07T07:38:46.3819605Z * [new branch] gh/fffrog/141/orig -> origin/gh/fffrog/141/orig 2025-09-07T07:38:46.3820407Z * [new branch] gh/fffrog/142/base -> origin/gh/fffrog/142/base 2025-09-07T07:38:46.3820818Z * [new branch] gh/fffrog/142/head -> origin/gh/fffrog/142/head 
2025-09-07T07:38:46.3821456Z * [new branch] gh/fffrog/142/orig -> origin/gh/fffrog/142/orig 2025-09-07T07:38:46.3822175Z * [new branch] gh/fffrog/143/base -> origin/gh/fffrog/143/base 2025-09-07T07:38:46.3822583Z * [new branch] gh/fffrog/143/head -> origin/gh/fffrog/143/head 2025-09-07T07:38:46.3823222Z * [new branch] gh/fffrog/143/orig -> origin/gh/fffrog/143/orig 2025-09-07T07:38:46.3824284Z * [new branch] gh/fffrog/144/base -> origin/gh/fffrog/144/base 2025-09-07T07:38:46.3825090Z * [new branch] gh/fffrog/144/head -> origin/gh/fffrog/144/head 2025-09-07T07:38:46.3825496Z * [new branch] gh/fffrog/144/orig -> origin/gh/fffrog/144/orig 2025-09-07T07:38:46.3826327Z * [new branch] gh/fffrog/145/base -> origin/gh/fffrog/145/base 2025-09-07T07:38:46.3826723Z * [new branch] gh/fffrog/145/head -> origin/gh/fffrog/145/head 2025-09-07T07:38:46.3827375Z * [new branch] gh/fffrog/145/orig -> origin/gh/fffrog/145/orig 2025-09-07T07:38:46.3828054Z * [new branch] gh/fffrog/146/base -> origin/gh/fffrog/146/base 2025-09-07T07:38:46.3828485Z * [new branch] gh/fffrog/146/head -> origin/gh/fffrog/146/head 2025-09-07T07:38:46.3829188Z * [new branch] gh/fffrog/146/orig -> origin/gh/fffrog/146/orig 2025-09-07T07:38:46.3829858Z * [new branch] gh/fffrog/147/base -> origin/gh/fffrog/147/base 2025-09-07T07:38:46.3830258Z * [new branch] gh/fffrog/147/head -> origin/gh/fffrog/147/head 2025-09-07T07:38:46.3830891Z * [new branch] gh/fffrog/147/orig -> origin/gh/fffrog/147/orig 2025-09-07T07:38:46.3831572Z * [new branch] gh/fffrog/148/base -> origin/gh/fffrog/148/base 2025-09-07T07:38:46.3832022Z * [new branch] gh/fffrog/148/head -> origin/gh/fffrog/148/head 2025-09-07T07:38:46.3832593Z * [new branch] gh/fffrog/148/orig -> origin/gh/fffrog/148/orig 2025-09-07T07:38:46.3833425Z * [new branch] gh/fffrog/149/base -> origin/gh/fffrog/149/base 2025-09-07T07:38:46.3833896Z * [new branch] gh/fffrog/149/head -> origin/gh/fffrog/149/head 2025-09-07T07:38:46.3834539Z * [new branch] gh/fffrog/149/orig -> origin/gh/fffrog/149/orig 2025-09-07T07:38:46.3835234Z * [new branch] gh/fffrog/150/base -> origin/gh/fffrog/150/base 2025-09-07T07:38:46.3835630Z * [new branch] gh/fffrog/150/head -> origin/gh/fffrog/150/head 2025-09-07T07:38:46.3836263Z * [new branch] gh/fffrog/150/orig -> origin/gh/fffrog/150/orig 2025-09-07T07:38:46.3837018Z * [new branch] gh/fffrog/151/base -> origin/gh/fffrog/151/base 2025-09-07T07:38:46.3837595Z * [new branch] gh/fffrog/151/head -> origin/gh/fffrog/151/head 2025-09-07T07:38:46.3837961Z * [new branch] gh/fffrog/151/orig -> origin/gh/fffrog/151/orig 2025-09-07T07:38:46.3838765Z * [new branch] gh/fffrog/152/base -> origin/gh/fffrog/152/base 2025-09-07T07:38:46.3839175Z * [new branch] gh/fffrog/152/head -> origin/gh/fffrog/152/head 2025-09-07T07:38:46.3840011Z * [new branch] gh/fffrog/153/base -> origin/gh/fffrog/153/base 2025-09-07T07:38:46.3840403Z * [new branch] gh/fffrog/153/head -> origin/gh/fffrog/153/head 2025-09-07T07:38:46.3841027Z * [new branch] gh/fffrog/153/orig -> origin/gh/fffrog/153/orig 2025-09-07T07:38:46.3841917Z * [new branch] gh/gmagogsfm/1/base -> origin/gh/gmagogsfm/1/base 2025-09-07T07:38:46.3842532Z * [new branch] gh/gmagogsfm/1/head -> origin/gh/gmagogsfm/1/head 2025-09-07T07:38:46.3842982Z * [new branch] gh/gmagogsfm/1/orig -> origin/gh/gmagogsfm/1/orig 2025-09-07T07:38:46.3843739Z * [new branch] gh/gmagogsfm/2/base -> origin/gh/gmagogsfm/2/base 2025-09-07T07:38:46.3844127Z * [new branch] gh/gmagogsfm/2/head -> origin/gh/gmagogsfm/2/head 2025-09-07T07:38:46.3844697Z * [new branch] 
gh/gmagogsfm/2/orig -> origin/gh/gmagogsfm/2/orig 2025-09-07T07:38:46.3845342Z * [new branch] gh/gmagogsfm/3/base -> origin/gh/gmagogsfm/3/base 2025-09-07T07:38:46.3845924Z * [new branch] gh/gmagogsfm/3/head -> origin/gh/gmagogsfm/3/head 2025-09-07T07:38:46.3846350Z * [new branch] gh/gmagogsfm/3/orig -> origin/gh/gmagogsfm/3/orig 2025-09-07T07:38:46.3847343Z * [new branch] gh/guangyey/134/base -> origin/gh/guangyey/134/base 2025-09-07T07:38:46.3847742Z * [new branch] gh/guangyey/134/head -> origin/gh/guangyey/134/head 2025-09-07T07:38:46.3848409Z * [new branch] gh/guangyey/134/orig -> origin/gh/guangyey/134/orig 2025-09-07T07:38:46.3849048Z * [new branch] gh/guangyey/135/base -> origin/gh/guangyey/135/base 2025-09-07T07:38:46.3849505Z * [new branch] gh/guangyey/135/head -> origin/gh/guangyey/135/head 2025-09-07T07:38:46.3850178Z * [new branch] gh/guangyey/135/orig -> origin/gh/guangyey/135/orig 2025-09-07T07:38:46.3850974Z * [new branch] gh/guangyey/139/base -> origin/gh/guangyey/139/base 2025-09-07T07:38:46.3851399Z * [new branch] gh/guangyey/139/head -> origin/gh/guangyey/139/head 2025-09-07T07:38:46.3851972Z * [new branch] gh/guangyey/139/orig -> origin/gh/guangyey/139/orig 2025-09-07T07:38:46.3852654Z * [new branch] gh/guangyey/140/base -> origin/gh/guangyey/140/base 2025-09-07T07:38:46.3853064Z * [new branch] gh/guangyey/140/head -> origin/gh/guangyey/140/head 2025-09-07T07:38:46.3853644Z * [new branch] gh/guangyey/140/orig -> origin/gh/guangyey/140/orig 2025-09-07T07:38:46.3854314Z * [new branch] gh/guangyey/142/base -> origin/gh/guangyey/142/base 2025-09-07T07:38:46.3854712Z * [new branch] gh/guangyey/142/head -> origin/gh/guangyey/142/head 2025-09-07T07:38:46.3855329Z * [new branch] gh/guangyey/142/orig -> origin/gh/guangyey/142/orig 2025-09-07T07:38:46.3856002Z * [new branch] gh/guangyey/145/base -> origin/gh/guangyey/145/base 2025-09-07T07:38:46.3856399Z * [new branch] gh/guangyey/145/head -> origin/gh/guangyey/145/head 2025-09-07T07:38:46.3857099Z * [new branch] gh/guangyey/145/orig -> origin/gh/guangyey/145/orig 2025-09-07T07:38:46.3858094Z * [new branch] gh/guangyey/153/base -> origin/gh/guangyey/153/base 2025-09-07T07:38:46.3858915Z * [new branch] gh/guangyey/153/head -> origin/gh/guangyey/153/head 2025-09-07T07:38:46.3859041Z * [new branch] gh/guangyey/153/orig -> origin/gh/guangyey/153/orig 2025-09-07T07:38:46.3859942Z * [new branch] gh/guangyey/159/base -> origin/gh/guangyey/159/base 2025-09-07T07:38:46.3860359Z * [new branch] gh/guangyey/159/head -> origin/gh/guangyey/159/head 2025-09-07T07:38:46.3861007Z * [new branch] gh/guangyey/159/orig -> origin/gh/guangyey/159/orig 2025-09-07T07:38:46.3861712Z * [new branch] gh/guangyey/163/base -> origin/gh/guangyey/163/base 2025-09-07T07:38:46.3862111Z * [new branch] gh/guangyey/163/head -> origin/gh/guangyey/163/head 2025-09-07T07:38:46.3862683Z * [new branch] gh/guangyey/163/orig -> origin/gh/guangyey/163/orig 2025-09-07T07:38:46.3863730Z * [new branch] gh/guangyey/168/base -> origin/gh/guangyey/168/base 2025-09-07T07:38:46.3864151Z * [new branch] gh/guangyey/168/head -> origin/gh/guangyey/168/head 2025-09-07T07:38:46.3864796Z * [new branch] gh/guangyey/168/orig -> origin/gh/guangyey/168/orig 2025-09-07T07:38:46.3865497Z * [new branch] gh/guangyey/169/base -> origin/gh/guangyey/169/base 2025-09-07T07:38:46.3865895Z * [new branch] gh/guangyey/169/head -> origin/gh/guangyey/169/head 2025-09-07T07:38:46.3866469Z * [new branch] gh/guangyey/169/orig -> origin/gh/guangyey/169/orig 2025-09-07T07:38:46.3867188Z * [new branch] 
gh/guangyey/170/base -> origin/gh/guangyey/170/base 2025-09-07T07:38:46.3867616Z * [new branch] gh/guangyey/170/head -> origin/gh/guangyey/170/head 2025-09-07T07:38:46.3868202Z * [new branch] gh/guangyey/170/orig -> origin/gh/guangyey/170/orig 2025-09-07T07:38:46.3869013Z * [new branch] gh/guangyey/171/base -> origin/gh/guangyey/171/base 2025-09-07T07:38:46.3869426Z * [new branch] gh/guangyey/171/head -> origin/gh/guangyey/171/head 2025-09-07T07:38:46.3869994Z * [new branch] gh/guangyey/171/orig -> origin/gh/guangyey/171/orig 2025-09-07T07:38:46.3870698Z * [new branch] gh/guangyey/174/base -> origin/gh/guangyey/174/base 2025-09-07T07:38:46.3871106Z * [new branch] gh/guangyey/174/head -> origin/gh/guangyey/174/head 2025-09-07T07:38:46.3871757Z * [new branch] gh/guangyey/174/orig -> origin/gh/guangyey/174/orig 2025-09-07T07:38:46.3872483Z * [new branch] gh/guangyey/176/base -> origin/gh/guangyey/176/base 2025-09-07T07:38:46.3872863Z * [new branch] gh/guangyey/176/head -> origin/gh/guangyey/176/head 2025-09-07T07:38:46.3873432Z * [new branch] gh/guangyey/176/orig -> origin/gh/guangyey/176/orig 2025-09-07T07:38:46.3874133Z * [new branch] gh/guangyey/178/base -> origin/gh/guangyey/178/base 2025-09-07T07:38:46.3874969Z * [new branch] gh/guangyey/178/head -> origin/gh/guangyey/178/head 2025-09-07T07:38:46.3875438Z * [new branch] gh/guangyey/178/orig -> origin/gh/guangyey/178/orig 2025-09-07T07:38:46.3876445Z * [new branch] gh/guangyey/181/base -> origin/gh/guangyey/181/base 2025-09-07T07:38:46.3876867Z * [new branch] gh/guangyey/181/head -> origin/gh/guangyey/181/head 2025-09-07T07:38:46.3877451Z * [new branch] gh/guangyey/181/orig -> origin/gh/guangyey/181/orig 2025-09-07T07:38:46.3878278Z * [new branch] gh/guangyey/182/base -> origin/gh/guangyey/182/base 2025-09-07T07:38:46.3878741Z * [new branch] gh/guangyey/182/head -> origin/gh/guangyey/182/head 2025-09-07T07:38:46.3879351Z * [new branch] gh/guangyey/182/orig -> origin/gh/guangyey/182/orig 2025-09-07T07:38:46.3880020Z * [new branch] gh/guangyey/183/base -> origin/gh/guangyey/183/base 2025-09-07T07:38:46.3880431Z * [new branch] gh/guangyey/183/head -> origin/gh/guangyey/183/head 2025-09-07T07:38:46.3881060Z * [new branch] gh/guangyey/183/orig -> origin/gh/guangyey/183/orig 2025-09-07T07:38:46.3881877Z * [new branch] gh/guangyey/184/base -> origin/gh/guangyey/184/base 2025-09-07T07:38:46.3882311Z * [new branch] gh/guangyey/184/head -> origin/gh/guangyey/184/head 2025-09-07T07:38:46.3882883Z * [new branch] gh/guangyey/184/orig -> origin/gh/guangyey/184/orig 2025-09-07T07:38:46.3883580Z * [new branch] gh/guangyey/185/base -> origin/gh/guangyey/185/base 2025-09-07T07:38:46.3884022Z * [new branch] gh/guangyey/185/head -> origin/gh/guangyey/185/head 2025-09-07T07:38:46.3884590Z * [new branch] gh/guangyey/185/orig -> origin/gh/guangyey/185/orig 2025-09-07T07:38:46.3885343Z * [new branch] gh/guangyey/186/base -> origin/gh/guangyey/186/base 2025-09-07T07:38:46.3885743Z * [new branch] gh/guangyey/186/head -> origin/gh/guangyey/186/head 2025-09-07T07:38:46.3886363Z * [new branch] gh/guangyey/186/orig -> origin/gh/guangyey/186/orig 2025-09-07T07:38:46.3887147Z * [new branch] gh/guangyey/187/base -> origin/gh/guangyey/187/base 2025-09-07T07:38:46.3887602Z * [new branch] gh/guangyey/187/head -> origin/gh/guangyey/187/head 2025-09-07T07:38:46.3888173Z * [new branch] gh/guangyey/187/orig -> origin/gh/guangyey/187/orig 2025-09-07T07:38:46.3888893Z * [new branch] gh/guangyey/188/base -> origin/gh/guangyey/188/base 2025-09-07T07:38:46.3889320Z * [new branch] 
gh/guangyey/188/head -> origin/gh/guangyey/188/head 2025-09-07T07:38:46.3889917Z * [new branch] gh/guangyey/188/orig -> origin/gh/guangyey/188/orig 2025-09-07T07:38:46.3890647Z * [new branch] gh/guangyey/189/base -> origin/gh/guangyey/189/base 2025-09-07T07:38:46.3891035Z * [new branch] gh/guangyey/189/head -> origin/gh/guangyey/189/head 2025-09-07T07:38:46.3891613Z * [new branch] gh/guangyey/189/orig -> origin/gh/guangyey/189/orig 2025-09-07T07:38:46.3892310Z * [new branch] gh/guangyey/190/base -> origin/gh/guangyey/190/base 2025-09-07T07:38:46.3892784Z * [new branch] gh/guangyey/190/head -> origin/gh/guangyey/190/head 2025-09-07T07:38:46.3893366Z * [new branch] gh/guangyey/190/orig -> origin/gh/guangyey/190/orig 2025-09-07T07:38:46.3894090Z * [new branch] gh/guangyey/191/base -> origin/gh/guangyey/191/base 2025-09-07T07:38:46.3894496Z * [new branch] gh/guangyey/191/head -> origin/gh/guangyey/191/head 2025-09-07T07:38:46.3895192Z * [new branch] gh/guangyey/191/orig -> origin/gh/guangyey/191/orig 2025-09-07T07:38:46.3895932Z * [new branch] gh/guangyey/192/base -> origin/gh/guangyey/192/base 2025-09-07T07:38:46.3896358Z * [new branch] gh/guangyey/192/head -> origin/gh/guangyey/192/head 2025-09-07T07:38:46.3896984Z * [new branch] gh/guangyey/192/orig -> origin/gh/guangyey/192/orig 2025-09-07T07:38:46.3897706Z * [new branch] gh/guangyey/193/base -> origin/gh/guangyey/193/base 2025-09-07T07:38:46.3898117Z * [new branch] gh/guangyey/193/head -> origin/gh/guangyey/193/head 2025-09-07T07:38:46.3898761Z * [new branch] gh/guangyey/193/orig -> origin/gh/guangyey/193/orig 2025-09-07T07:38:46.3899503Z * [new branch] gh/guangyey/194/base -> origin/gh/guangyey/194/base 2025-09-07T07:38:46.3899895Z * [new branch] gh/guangyey/194/head -> origin/gh/guangyey/194/head 2025-09-07T07:38:46.3900544Z * [new branch] gh/guangyey/194/orig -> origin/gh/guangyey/194/orig 2025-09-07T07:38:46.3901267Z * [new branch] gh/guangyey/195/base -> origin/gh/guangyey/195/base 2025-09-07T07:38:46.3901833Z * [new branch] gh/guangyey/195/head -> origin/gh/guangyey/195/head 2025-09-07T07:38:46.3902257Z * [new branch] gh/guangyey/195/orig -> origin/gh/guangyey/195/orig 2025-09-07T07:38:46.3903082Z * [new branch] gh/guangyey/196/base -> origin/gh/guangyey/196/base 2025-09-07T07:38:46.3903509Z * [new branch] gh/guangyey/196/head -> origin/gh/guangyey/196/head 2025-09-07T07:38:46.3904195Z * [new branch] gh/guangyey/196/orig -> origin/gh/guangyey/196/orig 2025-09-07T07:38:46.3904921Z * [new branch] gh/guangyey/197/base -> origin/gh/guangyey/197/base 2025-09-07T07:38:46.3905350Z * [new branch] gh/guangyey/197/head -> origin/gh/guangyey/197/head 2025-09-07T07:38:46.3905927Z * [new branch] gh/guangyey/197/orig -> origin/gh/guangyey/197/orig 2025-09-07T07:38:46.3906609Z * [new branch] gh/guangyey/198/base -> origin/gh/guangyey/198/base 2025-09-07T07:38:46.3907066Z * [new branch] gh/guangyey/198/head -> origin/gh/guangyey/198/head 2025-09-07T07:38:46.3907697Z * [new branch] gh/guangyey/198/orig -> origin/gh/guangyey/198/orig 2025-09-07T07:38:46.3908456Z * [new branch] gh/guangyey/199/base -> origin/gh/guangyey/199/base 2025-09-07T07:38:46.3908854Z * [new branch] gh/guangyey/199/head -> origin/gh/guangyey/199/head 2025-09-07T07:38:46.3909472Z * [new branch] gh/guangyey/199/orig -> origin/gh/guangyey/199/orig 2025-09-07T07:38:46.3910188Z * [new branch] gh/guangyey/200/base -> origin/gh/guangyey/200/base 2025-09-07T07:38:46.3910611Z * [new branch] gh/guangyey/200/head -> origin/gh/guangyey/200/head 2025-09-07T07:38:46.3911137Z * [new branch] 
gh/guangyey/200/orig -> origin/gh/guangyey/200/orig 2025-09-07T07:38:46.3911892Z * [new branch] gh/guangyey/201/base -> origin/gh/guangyey/201/base 2025-09-07T07:38:46.3912348Z * [new branch] gh/guangyey/201/head -> origin/gh/guangyey/201/head 2025-09-07T07:38:46.3913023Z * [new branch] gh/guangyey/201/orig -> origin/gh/guangyey/201/orig 2025-09-07T07:38:46.3913756Z * [new branch] gh/guangyey/202/base -> origin/gh/guangyey/202/base 2025-09-07T07:38:46.3914159Z * [new branch] gh/guangyey/202/head -> origin/gh/guangyey/202/head 2025-09-07T07:38:46.3914793Z * [new branch] gh/guangyey/202/orig -> origin/gh/guangyey/202/orig 2025-09-07T07:38:46.3915470Z * [new branch] gh/guangyey/203/base -> origin/gh/guangyey/203/base 2025-09-07T07:38:46.3915906Z * [new branch] gh/guangyey/203/head -> origin/gh/guangyey/203/head 2025-09-07T07:38:46.3916532Z * [new branch] gh/guangyey/203/orig -> origin/gh/guangyey/203/orig 2025-09-07T07:38:46.3917215Z * [new branch] gh/guangyey/204/base -> origin/gh/guangyey/204/base 2025-09-07T07:38:46.3917623Z * [new branch] gh/guangyey/204/head -> origin/gh/guangyey/204/head 2025-09-07T07:38:46.3918264Z * [new branch] gh/guangyey/204/orig -> origin/gh/guangyey/204/orig 2025-09-07T07:38:46.3918950Z * [new branch] gh/guangyey/205/base -> origin/gh/guangyey/205/base 2025-09-07T07:38:46.3919356Z * [new branch] gh/guangyey/205/head -> origin/gh/guangyey/205/head 2025-09-07T07:38:46.3919931Z * [new branch] gh/guangyey/205/orig -> origin/gh/guangyey/205/orig 2025-09-07T07:38:46.3920730Z * [new branch] gh/guangyey/206/base -> origin/gh/guangyey/206/base 2025-09-07T07:38:46.3921149Z * [new branch] gh/guangyey/206/head -> origin/gh/guangyey/206/head 2025-09-07T07:38:46.3921820Z * [new branch] gh/guangyey/206/orig -> origin/gh/guangyey/206/orig 2025-09-07T07:38:46.3922534Z * [new branch] gh/guangyey/207/base -> origin/gh/guangyey/207/base 2025-09-07T07:38:46.3922930Z * [new branch] gh/guangyey/207/head -> origin/gh/guangyey/207/head 2025-09-07T07:38:46.3923815Z * [new branch] gh/guangyey/207/orig -> origin/gh/guangyey/207/orig 2025-09-07T07:38:46.3924179Z * [new branch] gh/guangyey/79/base -> origin/gh/guangyey/79/base 2025-09-07T07:38:46.3924761Z * [new branch] gh/guangyey/79/head -> origin/gh/guangyey/79/head 2025-09-07T07:38:46.3925178Z * [new branch] gh/guangyey/79/orig -> origin/gh/guangyey/79/orig 2025-09-07T07:38:46.3925994Z * [new branch] gh/guangyey/89/base -> origin/gh/guangyey/89/base 2025-09-07T07:38:46.3926389Z * [new branch] gh/guangyey/89/head -> origin/gh/guangyey/89/head 2025-09-07T07:38:46.3926958Z * [new branch] gh/guangyey/89/orig -> origin/gh/guangyey/89/orig 2025-09-07T07:38:46.3927953Z * [new branch] gh/guilhermeleobas/107/base -> origin/gh/guilhermeleobas/107/base 2025-09-07T07:38:46.3928410Z * [new branch] gh/guilhermeleobas/107/head -> origin/gh/guilhermeleobas/107/head 2025-09-07T07:38:46.3929179Z * [new branch] gh/guilhermeleobas/107/orig -> origin/gh/guilhermeleobas/107/orig 2025-09-07T07:38:46.3929664Z * [new branch] gh/guilhermeleobas/108/base -> origin/gh/guilhermeleobas/108/base 2025-09-07T07:38:46.3930271Z * [new branch] gh/guilhermeleobas/108/head -> origin/gh/guilhermeleobas/108/head 2025-09-07T07:38:46.3930873Z * [new branch] gh/guilhermeleobas/108/orig -> origin/gh/guilhermeleobas/108/orig 2025-09-07T07:38:46.3931578Z * [new branch] gh/guilhermeleobas/124/base -> origin/gh/guilhermeleobas/124/base 2025-09-07T07:38:46.3932006Z * [new branch] gh/guilhermeleobas/124/head -> origin/gh/guilhermeleobas/124/head 2025-09-07T07:38:46.3932652Z * [new 
branch] gh/guilhermeleobas/124/orig -> origin/gh/guilhermeleobas/124/orig 2025-09-07T07:38:46.3933391Z * [new branch] gh/guilhermeleobas/147/base -> origin/gh/guilhermeleobas/147/base 2025-09-07T07:38:46.3933820Z * [new branch] gh/guilhermeleobas/147/head -> origin/gh/guilhermeleobas/147/head 2025-09-07T07:38:46.3934452Z * [new branch] gh/guilhermeleobas/147/orig -> origin/gh/guilhermeleobas/147/orig 2025-09-07T07:38:46.3935160Z * [new branch] gh/guilhermeleobas/150/base -> origin/gh/guilhermeleobas/150/base 2025-09-07T07:38:46.3935657Z * [new branch] gh/guilhermeleobas/150/head -> origin/gh/guilhermeleobas/150/head 2025-09-07T07:38:46.3936246Z * [new branch] gh/guilhermeleobas/150/orig -> origin/gh/guilhermeleobas/150/orig 2025-09-07T07:38:46.3937003Z * [new branch] gh/guilhermeleobas/163/base -> origin/gh/guilhermeleobas/163/base 2025-09-07T07:38:46.3937402Z * [new branch] gh/guilhermeleobas/163/head -> origin/gh/guilhermeleobas/163/head 2025-09-07T07:38:46.3938002Z * [new branch] gh/guilhermeleobas/163/orig -> origin/gh/guilhermeleobas/163/orig 2025-09-07T07:38:46.3938684Z * [new branch] gh/guilhermeleobas/164/base -> origin/gh/guilhermeleobas/164/base 2025-09-07T07:38:46.3939375Z * [new branch] gh/guilhermeleobas/164/head -> origin/gh/guilhermeleobas/164/head 2025-09-07T07:38:46.3939786Z * [new branch] gh/guilhermeleobas/164/orig -> origin/gh/guilhermeleobas/164/orig 2025-09-07T07:38:46.3940561Z * [new branch] gh/guilhermeleobas/165/base -> origin/gh/guilhermeleobas/165/base 2025-09-07T07:38:46.3940976Z * [new branch] gh/guilhermeleobas/165/head -> origin/gh/guilhermeleobas/165/head 2025-09-07T07:38:46.3941577Z * [new branch] gh/guilhermeleobas/165/orig -> origin/gh/guilhermeleobas/165/orig 2025-09-07T07:38:46.3942232Z * [new branch] gh/guilhermeleobas/166/base -> origin/gh/guilhermeleobas/166/base 2025-09-07T07:38:46.3942711Z * [new branch] gh/guilhermeleobas/166/head -> origin/gh/guilhermeleobas/166/head 2025-09-07T07:38:46.3943353Z * [new branch] gh/guilhermeleobas/166/orig -> origin/gh/guilhermeleobas/166/orig 2025-09-07T07:38:46.3944050Z * [new branch] gh/guilhermeleobas/167/base -> origin/gh/guilhermeleobas/167/base 2025-09-07T07:38:46.3944453Z * [new branch] gh/guilhermeleobas/167/head -> origin/gh/guilhermeleobas/167/head 2025-09-07T07:38:46.3945049Z * [new branch] gh/guilhermeleobas/167/orig -> origin/gh/guilhermeleobas/167/orig 2025-09-07T07:38:46.3945718Z * [new branch] gh/guilhermeleobas/168/base -> origin/gh/guilhermeleobas/168/base 2025-09-07T07:38:46.3946629Z * [new branch] gh/guilhermeleobas/168/head -> origin/gh/guilhermeleobas/168/head 2025-09-07T07:38:46.3947080Z * [new branch] gh/guilhermeleobas/168/orig -> origin/gh/guilhermeleobas/168/orig 2025-09-07T07:38:46.3947838Z * [new branch] gh/guilhermeleobas/169/base -> origin/gh/guilhermeleobas/169/base 2025-09-07T07:38:46.3948445Z * [new branch] gh/guilhermeleobas/169/head -> origin/gh/guilhermeleobas/169/head 2025-09-07T07:38:46.3948872Z * [new branch] gh/guilhermeleobas/169/orig -> origin/gh/guilhermeleobas/169/orig 2025-09-07T07:38:46.3949630Z * [new branch] gh/guilhermeleobas/170/base -> origin/gh/guilhermeleobas/170/base 2025-09-07T07:38:46.3950112Z * [new branch] gh/guilhermeleobas/170/head -> origin/gh/guilhermeleobas/170/head 2025-09-07T07:38:46.3950703Z * [new branch] gh/guilhermeleobas/170/orig -> origin/gh/guilhermeleobas/170/orig 2025-09-07T07:38:46.3951400Z * [new branch] gh/guilhermeleobas/171/base -> origin/gh/guilhermeleobas/171/base 2025-09-07T07:38:46.3952041Z * [new branch] 
gh/guilhermeleobas/171/head -> origin/gh/guilhermeleobas/171/head 2025-09-07T07:38:46.3952486Z * [new branch] gh/guilhermeleobas/171/orig -> origin/gh/guilhermeleobas/171/orig 2025-09-07T07:38:46.3953256Z * [new branch] gh/guilhermeleobas/173/base -> origin/gh/guilhermeleobas/173/base 2025-09-07T07:38:46.3953692Z * [new branch] gh/guilhermeleobas/173/head -> origin/gh/guilhermeleobas/173/head 2025-09-07T07:38:46.3954291Z * [new branch] gh/guilhermeleobas/173/orig -> origin/gh/guilhermeleobas/173/orig 2025-09-07T07:38:46.3954950Z * [new branch] gh/guilhermeleobas/192/base -> origin/gh/guilhermeleobas/192/base 2025-09-07T07:38:46.3955379Z * [new branch] gh/guilhermeleobas/192/head -> origin/gh/guilhermeleobas/192/head 2025-09-07T07:38:46.3955980Z * [new branch] gh/guilhermeleobas/192/orig -> origin/gh/guilhermeleobas/192/orig 2025-09-07T07:38:46.3956938Z * [new branch] gh/guilhermeleobas/193/base -> origin/gh/guilhermeleobas/193/base 2025-09-07T07:38:46.3957616Z * [new branch] gh/guilhermeleobas/193/head -> origin/gh/guilhermeleobas/193/head 2025-09-07T07:38:46.3958242Z * [new branch] gh/guilhermeleobas/193/orig -> origin/gh/guilhermeleobas/193/orig 2025-09-07T07:38:46.3958926Z * [new branch] gh/guilhermeleobas/194/base -> origin/gh/guilhermeleobas/194/base 2025-09-07T07:38:46.3959344Z * [new branch] gh/guilhermeleobas/194/head -> origin/gh/guilhermeleobas/194/head 2025-09-07T07:38:46.3959931Z * [new branch] gh/guilhermeleobas/194/orig -> origin/gh/guilhermeleobas/194/orig 2025-09-07T07:38:46.3960804Z * [new branch] gh/guilhermeleobas/203/base -> origin/gh/guilhermeleobas/203/base 2025-09-07T07:38:46.3961068Z * [new branch] gh/guilhermeleobas/203/head -> origin/gh/guilhermeleobas/203/head 2025-09-07T07:38:46.3961661Z * [new branch] gh/guilhermeleobas/203/orig -> origin/gh/guilhermeleobas/203/orig 2025-09-07T07:38:46.3962304Z * [new branch] gh/guilhermeleobas/204/base -> origin/gh/guilhermeleobas/204/base 2025-09-07T07:38:46.3962726Z * [new branch] gh/guilhermeleobas/204/head -> origin/gh/guilhermeleobas/204/head 2025-09-07T07:38:46.3963301Z * [new branch] gh/guilhermeleobas/204/orig -> origin/gh/guilhermeleobas/204/orig 2025-09-07T07:38:46.3964038Z * [new branch] gh/guilhermeleobas/205/base -> origin/gh/guilhermeleobas/205/base 2025-09-07T07:38:46.3964690Z * [new branch] gh/guilhermeleobas/205/head -> origin/gh/guilhermeleobas/205/head 2025-09-07T07:38:46.3965031Z * [new branch] gh/guilhermeleobas/205/orig -> origin/gh/guilhermeleobas/205/orig 2025-09-07T07:38:46.3965848Z * [new branch] gh/guilhermeleobas/209/base -> origin/gh/guilhermeleobas/209/base 2025-09-07T07:38:46.3966476Z * [new branch] gh/guilhermeleobas/209/head -> origin/gh/guilhermeleobas/209/head 2025-09-07T07:38:46.3966927Z * [new branch] gh/guilhermeleobas/209/orig -> origin/gh/guilhermeleobas/209/orig 2025-09-07T07:38:46.3967709Z * [new branch] gh/guilhermeleobas/210/base -> origin/gh/guilhermeleobas/210/base 2025-09-07T07:38:46.3968122Z * [new branch] gh/guilhermeleobas/210/head -> origin/gh/guilhermeleobas/210/head 2025-09-07T07:38:46.3968719Z * [new branch] gh/guilhermeleobas/210/orig -> origin/gh/guilhermeleobas/210/orig 2025-09-07T07:38:46.3969458Z * [new branch] gh/guilhermeleobas/211/base -> origin/gh/guilhermeleobas/211/base 2025-09-07T07:38:46.3969883Z * [new branch] gh/guilhermeleobas/211/head -> origin/gh/guilhermeleobas/211/head 2025-09-07T07:38:46.3970469Z * [new branch] gh/guilhermeleobas/211/orig -> origin/gh/guilhermeleobas/211/orig 2025-09-07T07:38:46.3971582Z * [new branch] gh/guilhermeleobas/214/base -> 
origin/gh/guilhermeleobas/214/base 2025-09-07T07:38:46.3972104Z * [new branch] gh/guilhermeleobas/214/head -> origin/gh/guilhermeleobas/214/head 2025-09-07T07:38:46.3972594Z * [new branch] gh/guilhermeleobas/214/orig -> origin/gh/guilhermeleobas/214/orig 2025-09-07T07:38:46.3973362Z * [new branch] gh/guilhermeleobas/215/base -> origin/gh/guilhermeleobas/215/base 2025-09-07T07:38:46.3973804Z * [new branch] gh/guilhermeleobas/215/head -> origin/gh/guilhermeleobas/215/head 2025-09-07T07:38:46.3974395Z * [new branch] gh/guilhermeleobas/215/orig -> origin/gh/guilhermeleobas/215/orig 2025-09-07T07:38:46.3975107Z * [new branch] gh/guilhermeleobas/216/base -> origin/gh/guilhermeleobas/216/base 2025-09-07T07:38:46.3975737Z * [new branch] gh/guilhermeleobas/216/head -> origin/gh/guilhermeleobas/216/head 2025-09-07T07:38:46.3976181Z * [new branch] gh/guilhermeleobas/216/orig -> origin/gh/guilhermeleobas/216/orig 2025-09-07T07:38:46.3976966Z * [new branch] gh/guilhermeleobas/217/base -> origin/gh/guilhermeleobas/217/base 2025-09-07T07:38:46.3977373Z * [new branch] gh/guilhermeleobas/217/head -> origin/gh/guilhermeleobas/217/head 2025-09-07T07:38:46.3977971Z * [new branch] gh/guilhermeleobas/217/orig -> origin/gh/guilhermeleobas/217/orig 2025-09-07T07:38:46.3978979Z * [new branch] gh/guilhermeleobas/219/base -> origin/gh/guilhermeleobas/219/base 2025-09-07T07:38:46.3979530Z * [new branch] gh/guilhermeleobas/219/head -> origin/gh/guilhermeleobas/219/head 2025-09-07T07:38:46.3979998Z * [new branch] gh/guilhermeleobas/219/orig -> origin/gh/guilhermeleobas/219/orig 2025-09-07T07:38:46.3980769Z * [new branch] gh/guilhermeleobas/220/base -> origin/gh/guilhermeleobas/220/base 2025-09-07T07:38:46.3981169Z * [new branch] gh/guilhermeleobas/220/head -> origin/gh/guilhermeleobas/220/head 2025-09-07T07:38:46.3986198Z * [new branch] gh/guilhermeleobas/220/orig -> origin/gh/guilhermeleobas/220/orig 2025-09-07T07:38:46.3986948Z * [new branch] gh/guilhermeleobas/221/base -> origin/gh/guilhermeleobas/221/base 2025-09-07T07:38:46.3987378Z * [new branch] gh/guilhermeleobas/221/head -> origin/gh/guilhermeleobas/221/head 2025-09-07T07:38:46.3987984Z * [new branch] gh/guilhermeleobas/221/orig -> origin/gh/guilhermeleobas/221/orig 2025-09-07T07:38:46.3989033Z * [new branch] gh/guilhermeleobas/222/base -> origin/gh/guilhermeleobas/222/base 2025-09-07T07:38:46.3989403Z * [new branch] gh/guilhermeleobas/222/head -> origin/gh/guilhermeleobas/222/head 2025-09-07T07:38:46.3989987Z * [new branch] gh/guilhermeleobas/222/orig -> origin/gh/guilhermeleobas/222/orig 2025-09-07T07:38:46.3990721Z * [new branch] gh/guilhermeleobas/223/base -> origin/gh/guilhermeleobas/223/base 2025-09-07T07:38:46.3991330Z * [new branch] gh/guilhermeleobas/223/head -> origin/gh/guilhermeleobas/223/head 2025-09-07T07:38:46.3991812Z * [new branch] gh/guilhermeleobas/223/orig -> origin/gh/guilhermeleobas/223/orig 2025-09-07T07:38:46.3992622Z * [new branch] gh/guilhermeleobas/224/base -> origin/gh/guilhermeleobas/224/base 2025-09-07T07:38:46.3993085Z * [new branch] gh/guilhermeleobas/224/head -> origin/gh/guilhermeleobas/224/head 2025-09-07T07:38:46.3993560Z * [new branch] gh/guilhermeleobas/224/orig -> origin/gh/guilhermeleobas/224/orig 2025-09-07T07:38:46.3994348Z * [new branch] gh/guilhermeleobas/225/base -> origin/gh/guilhermeleobas/225/base 2025-09-07T07:38:46.3994779Z * [new branch] gh/guilhermeleobas/225/head -> origin/gh/guilhermeleobas/225/head 2025-09-07T07:38:46.3995365Z * [new branch] gh/guilhermeleobas/225/orig -> origin/gh/guilhermeleobas/225/orig 
2025-09-07T07:38:46.3996047Z * [new branch] gh/guilhermeleobas/226/base -> origin/gh/guilhermeleobas/226/base 2025-09-07T07:38:46.3996459Z * [new branch] gh/guilhermeleobas/226/head -> origin/gh/guilhermeleobas/226/head 2025-09-07T07:38:46.3997059Z * [new branch] gh/guilhermeleobas/226/orig -> origin/gh/guilhermeleobas/226/orig 2025-09-07T07:38:46.3997874Z * [new branch] gh/guilhermeleobas/227/base -> origin/gh/guilhermeleobas/227/base 2025-09-07T07:38:46.3998355Z * [new branch] gh/guilhermeleobas/227/head -> origin/gh/guilhermeleobas/227/head 2025-09-07T07:38:46.3998956Z * [new branch] gh/guilhermeleobas/227/orig -> origin/gh/guilhermeleobas/227/orig 2025-09-07T07:38:46.4000033Z * [new branch] gh/guilhermeleobas/228/base -> origin/gh/guilhermeleobas/228/base 2025-09-07T07:38:46.4000416Z * [new branch] gh/guilhermeleobas/228/head -> origin/gh/guilhermeleobas/228/head 2025-09-07T07:38:46.4000864Z * [new branch] gh/guilhermeleobas/228/orig -> origin/gh/guilhermeleobas/228/orig 2025-09-07T07:38:46.4001638Z * [new branch] gh/guilhermeleobas/229/base -> origin/gh/guilhermeleobas/229/base 2025-09-07T07:38:46.4002080Z * [new branch] gh/guilhermeleobas/229/head -> origin/gh/guilhermeleobas/229/head 2025-09-07T07:38:46.4002704Z * [new branch] gh/guilhermeleobas/229/orig -> origin/gh/guilhermeleobas/229/orig 2025-09-07T07:38:46.4003399Z * [new branch] gh/guilhermeleobas/230/base -> origin/gh/guilhermeleobas/230/base 2025-09-07T07:38:46.4003824Z * [new branch] gh/guilhermeleobas/230/head -> origin/gh/guilhermeleobas/230/head 2025-09-07T07:38:46.4004415Z * [new branch] gh/guilhermeleobas/230/orig -> origin/gh/guilhermeleobas/230/orig 2025-09-07T07:38:46.4005108Z * [new branch] gh/guilhermeleobas/231/base -> origin/gh/guilhermeleobas/231/base 2025-09-07T07:38:46.4005686Z * [new branch] gh/guilhermeleobas/231/head -> origin/gh/guilhermeleobas/231/head 2025-09-07T07:38:46.4006115Z * [new branch] gh/guilhermeleobas/231/orig -> origin/gh/guilhermeleobas/231/orig 2025-09-07T07:38:46.4007542Z * [new branch] gh/guilhermeleobas/232/base -> origin/gh/guilhermeleobas/232/base 2025-09-07T07:38:46.4007735Z * [new branch] gh/guilhermeleobas/232/head -> origin/gh/guilhermeleobas/232/head 2025-09-07T07:38:46.4008098Z * [new branch] gh/guilhermeleobas/232/orig -> origin/gh/guilhermeleobas/232/orig 2025-09-07T07:38:46.4008987Z * [new branch] gh/guilhermeleobas/233/base -> origin/gh/guilhermeleobas/233/base 2025-09-07T07:38:46.4009230Z * [new branch] gh/guilhermeleobas/233/head -> origin/gh/guilhermeleobas/233/head 2025-09-07T07:38:46.4009853Z * [new branch] gh/guilhermeleobas/233/orig -> origin/gh/guilhermeleobas/233/orig 2025-09-07T07:38:46.4010576Z * [new branch] gh/guilhermeleobas/234/base -> origin/gh/guilhermeleobas/234/base 2025-09-07T07:38:46.4011036Z * [new branch] gh/guilhermeleobas/234/head -> origin/gh/guilhermeleobas/234/head 2025-09-07T07:38:46.4011585Z * [new branch] gh/guilhermeleobas/234/orig -> origin/gh/guilhermeleobas/234/orig 2025-09-07T07:38:46.4012277Z * [new branch] gh/guilhermeleobas/235/base -> origin/gh/guilhermeleobas/235/base 2025-09-07T07:38:46.4012835Z * [new branch] gh/guilhermeleobas/235/head -> origin/gh/guilhermeleobas/235/head 2025-09-07T07:38:46.4013366Z * [new branch] gh/guilhermeleobas/235/orig -> origin/gh/guilhermeleobas/235/orig 2025-09-07T07:38:46.4014125Z * [new branch] gh/guilhermeleobas/236/base -> origin/gh/guilhermeleobas/236/base 2025-09-07T07:38:46.4014535Z * [new branch] gh/guilhermeleobas/236/head -> origin/gh/guilhermeleobas/236/head 2025-09-07T07:38:46.4015166Z * 
[new branch] gh/guilhermeleobas/236/orig -> origin/gh/guilhermeleobas/236/orig 2025-09-07T07:38:46.4015963Z * [new branch] gh/guilhermeleobas/237/base -> origin/gh/guilhermeleobas/237/base 2025-09-07T07:38:46.4016739Z * [new branch] gh/guilhermeleobas/237/head -> origin/gh/guilhermeleobas/237/head 2025-09-07T07:38:46.4017179Z * [new branch] gh/guilhermeleobas/237/orig -> origin/gh/guilhermeleobas/237/orig 2025-09-07T07:38:46.4017990Z * [new branch] gh/guilhermeleobas/238/base -> origin/gh/guilhermeleobas/238/base 2025-09-07T07:38:46.4018459Z * [new branch] gh/guilhermeleobas/238/head -> origin/gh/guilhermeleobas/238/head 2025-09-07T07:38:46.4019049Z * [new branch] gh/guilhermeleobas/238/orig -> origin/gh/guilhermeleobas/238/orig 2025-09-07T07:38:46.4019765Z * [new branch] gh/guilhermeleobas/239/base -> origin/gh/guilhermeleobas/239/base 2025-09-07T07:38:46.4020257Z * [new branch] gh/guilhermeleobas/239/head -> origin/gh/guilhermeleobas/239/head 2025-09-07T07:38:46.4020860Z * [new branch] gh/guilhermeleobas/239/orig -> origin/gh/guilhermeleobas/239/orig 2025-09-07T07:38:46.4021639Z * [new branch] gh/guilhermeleobas/240/base -> origin/gh/guilhermeleobas/240/base 2025-09-07T07:38:46.4022052Z * [new branch] gh/guilhermeleobas/240/head -> origin/gh/guilhermeleobas/240/head 2025-09-07T07:38:46.4022637Z * [new branch] gh/guilhermeleobas/240/orig -> origin/gh/guilhermeleobas/240/orig 2025-09-07T07:38:46.4023378Z * [new branch] gh/guilhermeleobas/241/base -> origin/gh/guilhermeleobas/241/base 2025-09-07T07:38:46.4023964Z * [new branch] gh/guilhermeleobas/241/head -> origin/gh/guilhermeleobas/241/head 2025-09-07T07:38:46.4024561Z * [new branch] gh/guilhermeleobas/241/orig -> origin/gh/guilhermeleobas/241/orig 2025-09-07T07:38:46.4025334Z * [new branch] gh/guilhermeleobas/242/base -> origin/gh/guilhermeleobas/242/base 2025-09-07T07:38:46.4025765Z * [new branch] gh/guilhermeleobas/242/head -> origin/gh/guilhermeleobas/242/head 2025-09-07T07:38:46.4026354Z * [new branch] gh/guilhermeleobas/242/orig -> origin/gh/guilhermeleobas/242/orig 2025-09-07T07:38:46.4027008Z * [new branch] gh/guilhermeleobas/243/base -> origin/gh/guilhermeleobas/243/base 2025-09-07T07:38:46.4027613Z * [new branch] gh/guilhermeleobas/243/head -> origin/gh/guilhermeleobas/243/head 2025-09-07T07:38:46.4028048Z * [new branch] gh/guilhermeleobas/243/orig -> origin/gh/guilhermeleobas/243/orig 2025-09-07T07:38:46.4028911Z * [new branch] gh/guilhermeleobas/244/base -> origin/gh/guilhermeleobas/244/base 2025-09-07T07:38:46.4029295Z * [new branch] gh/guilhermeleobas/244/head -> origin/gh/guilhermeleobas/244/head 2025-09-07T07:38:46.4029827Z * [new branch] gh/guilhermeleobas/244/orig -> origin/gh/guilhermeleobas/244/orig 2025-09-07T07:38:46.4030569Z * [new branch] gh/guilhermeleobas/245/base -> origin/gh/guilhermeleobas/245/base 2025-09-07T07:38:46.4030994Z * [new branch] gh/guilhermeleobas/245/head -> origin/gh/guilhermeleobas/245/head 2025-09-07T07:38:46.4031578Z * [new branch] gh/guilhermeleobas/245/orig -> origin/gh/guilhermeleobas/245/orig 2025-09-07T07:38:46.4032335Z * [new branch] gh/guilhermeleobas/73/base -> origin/gh/guilhermeleobas/73/base 2025-09-07T07:38:46.4032771Z * [new branch] gh/guilhermeleobas/73/head -> origin/gh/guilhermeleobas/73/head 2025-09-07T07:38:46.4033452Z * [new branch] gh/guilhermeleobas/73/orig -> origin/gh/guilhermeleobas/73/orig 2025-09-07T07:38:46.4034477Z * [new branch] gh/henrylhtsang/140/base -> origin/gh/henrylhtsang/140/base 2025-09-07T07:38:46.4035103Z * [new branch] gh/henrylhtsang/140/head -> 
origin/gh/henrylhtsang/140/head 2025-09-07T07:38:46.4035472Z * [new branch] gh/henrylhtsang/140/orig -> origin/gh/henrylhtsang/140/orig 2025-09-07T07:38:46.4036221Z * [new branch] gh/henrylhtsang/141/base -> origin/gh/henrylhtsang/141/base 2025-09-07T07:38:46.4036639Z * [new branch] gh/henrylhtsang/141/head -> origin/gh/henrylhtsang/141/head 2025-09-07T07:38:46.4037231Z * [new branch] gh/henrylhtsang/141/orig -> origin/gh/henrylhtsang/141/orig 2025-09-07T07:38:46.4038159Z * [new branch] gh/henrylhtsang/142/base -> origin/gh/henrylhtsang/142/base 2025-09-07T07:38:46.4038692Z * [new branch] gh/henrylhtsang/142/head -> origin/gh/henrylhtsang/142/head 2025-09-07T07:38:46.4039287Z * [new branch] gh/henrylhtsang/142/orig -> origin/gh/henrylhtsang/142/orig 2025-09-07T07:38:46.4039995Z * [new branch] gh/henrylhtsang/143/base -> origin/gh/henrylhtsang/143/base 2025-09-07T07:38:46.4040406Z * [new branch] gh/henrylhtsang/143/head -> origin/gh/henrylhtsang/143/head 2025-09-07T07:38:46.4041002Z * [new branch] gh/henrylhtsang/143/orig -> origin/gh/henrylhtsang/143/orig 2025-09-07T07:38:46.4041811Z * [new branch] gh/henrylhtsang/144/base -> origin/gh/henrylhtsang/144/base 2025-09-07T07:38:46.4042243Z * [new branch] gh/henrylhtsang/144/head -> origin/gh/henrylhtsang/144/head 2025-09-07T07:38:46.4042976Z * [new branch] gh/henrylhtsang/144/orig -> origin/gh/henrylhtsang/144/orig 2025-09-07T07:38:46.4043737Z * [new branch] gh/henrylhtsang/145/base -> origin/gh/henrylhtsang/145/base 2025-09-07T07:38:46.4044170Z * [new branch] gh/henrylhtsang/145/head -> origin/gh/henrylhtsang/145/head 2025-09-07T07:38:46.4044740Z * [new branch] gh/henrylhtsang/145/orig -> origin/gh/henrylhtsang/145/orig 2025-09-07T07:38:46.4045579Z * [new branch] gh/henrylhtsang/146/base -> origin/gh/henrylhtsang/146/base 2025-09-07T07:38:46.4045989Z * [new branch] gh/henrylhtsang/146/head -> origin/gh/henrylhtsang/146/head 2025-09-07T07:38:46.4046593Z * [new branch] gh/henrylhtsang/146/orig -> origin/gh/henrylhtsang/146/orig 2025-09-07T07:38:46.4047277Z * [new branch] gh/henrylhtsang/147/base -> origin/gh/henrylhtsang/147/base 2025-09-07T07:38:46.4047731Z * [new branch] gh/henrylhtsang/147/head -> origin/gh/henrylhtsang/147/head 2025-09-07T07:38:46.4048362Z * [new branch] gh/henrylhtsang/147/orig -> origin/gh/henrylhtsang/147/orig 2025-09-07T07:38:46.4049285Z * [new branch] gh/henrylhtsang/148/base -> origin/gh/henrylhtsang/148/base 2025-09-07T07:38:46.4049929Z * [new branch] gh/henrylhtsang/148/head -> origin/gh/henrylhtsang/148/head 2025-09-07T07:38:46.4050322Z * [new branch] gh/henrylhtsang/148/orig -> origin/gh/henrylhtsang/148/orig 2025-09-07T07:38:46.4051124Z * [new branch] gh/henrylhtsang/149/base -> origin/gh/henrylhtsang/149/base 2025-09-07T07:38:46.4051548Z * [new branch] gh/henrylhtsang/149/head -> origin/gh/henrylhtsang/149/head 2025-09-07T07:38:46.4052275Z * [new branch] gh/henrylhtsang/149/orig -> origin/gh/henrylhtsang/149/orig 2025-09-07T07:38:46.4053189Z * [new branch] gh/huydhn/1/next -> origin/gh/huydhn/1/next 2025-09-07T07:38:46.4053807Z * [new branch] gh/huydhn/2/next -> origin/gh/huydhn/2/next 2025-09-07T07:38:46.4054518Z * [new branch] gh/huydhn/3/next -> origin/gh/huydhn/3/next 2025-09-07T07:38:46.4055231Z * [new branch] gh/huydhn/4/next -> origin/gh/huydhn/4/next 2025-09-07T07:38:46.4055976Z * [new branch] gh/huydhn/5/next -> origin/gh/huydhn/5/next 2025-09-07T07:38:46.4056658Z * [new branch] gh/huydhn/6/next -> origin/gh/huydhn/6/next 2025-09-07T07:38:46.4057563Z * [new branch] gh/int3/97/base -> 
origin/gh/int3/97/base 2025-09-07T07:38:46.4058389Z * [new branch] gh/int3/97/head -> origin/gh/int3/97/head 2025-09-07T07:38:46.4059262Z * [new branch] gh/isuruf/101/base -> origin/gh/isuruf/101/base 2025-09-07T07:38:46.4059683Z * [new branch] gh/isuruf/101/head -> origin/gh/isuruf/101/head 2025-09-07T07:38:46.4060458Z * [new branch] gh/isuruf/141/base -> origin/gh/isuruf/141/base 2025-09-07T07:38:46.4061322Z * [new branch] gh/isuruf/141/head -> origin/gh/isuruf/141/head 2025-09-07T07:38:46.4061761Z * [new branch] gh/isuruf/141/orig -> origin/gh/isuruf/141/orig 2025-09-07T07:38:46.4062573Z * [new branch] gh/isuruf/142/base -> origin/gh/isuruf/142/base 2025-09-07T07:38:46.4063180Z * [new branch] gh/isuruf/142/head -> origin/gh/isuruf/142/head 2025-09-07T07:38:46.4063630Z * [new branch] gh/isuruf/142/orig -> origin/gh/isuruf/142/orig 2025-09-07T07:38:46.4064426Z * [new branch] gh/isuruf/143/base -> origin/gh/isuruf/143/base 2025-09-07T07:38:46.4064821Z * [new branch] gh/isuruf/143/head -> origin/gh/isuruf/143/head 2025-09-07T07:38:46.4065425Z * [new branch] gh/isuruf/143/orig -> origin/gh/isuruf/143/orig 2025-09-07T07:38:46.4066099Z * [new branch] gh/isuruf/144/base -> origin/gh/isuruf/144/base 2025-09-07T07:38:46.4066525Z * [new branch] gh/isuruf/144/head -> origin/gh/isuruf/144/head 2025-09-07T07:38:46.4067144Z * [new branch] gh/isuruf/144/orig -> origin/gh/isuruf/144/orig 2025-09-07T07:38:46.4067827Z * [new branch] gh/isuruf/145/base -> origin/gh/isuruf/145/base 2025-09-07T07:38:46.4068253Z * [new branch] gh/isuruf/145/head -> origin/gh/isuruf/145/head 2025-09-07T07:38:46.4068823Z * [new branch] gh/isuruf/145/orig -> origin/gh/isuruf/145/orig 2025-09-07T07:38:46.4069606Z * [new branch] gh/isuruf/146/base -> origin/gh/isuruf/146/base 2025-09-07T07:38:46.4070030Z * [new branch] gh/isuruf/146/head -> origin/gh/isuruf/146/head 2025-09-07T07:38:46.4070643Z * [new branch] gh/isuruf/146/orig -> origin/gh/isuruf/146/orig 2025-09-07T07:38:46.4071424Z * [new branch] gh/isuruf/81/base -> origin/gh/isuruf/81/base 2025-09-07T07:38:46.4071845Z * [new branch] gh/isuruf/81/head -> origin/gh/isuruf/81/head 2025-09-07T07:38:46.4072468Z * [new branch] gh/isuruf/81/orig -> origin/gh/isuruf/81/orig 2025-09-07T07:38:46.4073314Z * [new branch] gh/jamesjwu/150/base -> origin/gh/jamesjwu/150/base 2025-09-07T07:38:46.4073712Z * [new branch] gh/jamesjwu/150/head -> origin/gh/jamesjwu/150/head 2025-09-07T07:38:46.4074294Z * [new branch] gh/jamesjwu/150/orig -> origin/gh/jamesjwu/150/orig 2025-09-07T07:38:46.4075098Z * [new branch] gh/jamesjwu/154/base -> origin/gh/jamesjwu/154/base 2025-09-07T07:38:46.4075850Z * [new branch] gh/jamesjwu/154/head -> origin/gh/jamesjwu/154/head 2025-09-07T07:38:46.4076421Z * [new branch] gh/jamesjwu/154/orig -> origin/gh/jamesjwu/154/orig 2025-09-07T07:38:46.4077117Z * [new branch] gh/jamesjwu/155/base -> origin/gh/jamesjwu/155/base 2025-09-07T07:38:46.4077526Z * [new branch] gh/jamesjwu/155/head -> origin/gh/jamesjwu/155/head 2025-09-07T07:38:46.4078141Z * [new branch] gh/jamesjwu/155/orig -> origin/gh/jamesjwu/155/orig 2025-09-07T07:38:46.4078850Z * [new branch] gh/jamesjwu/159/base -> origin/gh/jamesjwu/159/base 2025-09-07T07:38:46.4079280Z * [new branch] gh/jamesjwu/159/head -> origin/gh/jamesjwu/159/head 2025-09-07T07:38:46.4079895Z * [new branch] gh/jamesjwu/159/orig -> origin/gh/jamesjwu/159/orig 2025-09-07T07:38:46.4081052Z * [new branch] gh/jamesjwu/163/base -> origin/gh/jamesjwu/163/base 2025-09-07T07:38:46.4081504Z * [new branch] gh/jamesjwu/163/head -> 
origin/gh/jamesjwu/163/head 2025-09-07T07:38:46.4082114Z * [new branch] gh/jamesjwu/163/orig -> origin/gh/jamesjwu/163/orig 2025-09-07T07:38:46.4082805Z * [new branch] gh/jamesjwu/171/base -> origin/gh/jamesjwu/171/base 2025-09-07T07:38:46.4083222Z * [new branch] gh/jamesjwu/171/head -> origin/gh/jamesjwu/171/head 2025-09-07T07:38:46.4083875Z * [new branch] gh/jamesjwu/171/orig -> origin/gh/jamesjwu/171/orig 2025-09-07T07:38:46.4084613Z * [new branch] gh/jamesjwu/176/base -> origin/gh/jamesjwu/176/base 2025-09-07T07:38:46.4085006Z * [new branch] gh/jamesjwu/176/head -> origin/gh/jamesjwu/176/head 2025-09-07T07:38:46.4085587Z * [new branch] gh/jamesjwu/176/orig -> origin/gh/jamesjwu/176/orig 2025-09-07T07:38:46.4086296Z * [new branch] gh/jamesjwu/181/base -> origin/gh/jamesjwu/181/base 2025-09-07T07:38:46.4086719Z * [new branch] gh/jamesjwu/181/head -> origin/gh/jamesjwu/181/head 2025-09-07T07:38:46.4087326Z * [new branch] gh/jamesjwu/181/orig -> origin/gh/jamesjwu/181/orig 2025-09-07T07:38:46.4088005Z * [new branch] gh/jamesjwu/182/base -> origin/gh/jamesjwu/182/base 2025-09-07T07:38:46.4088782Z * [new branch] gh/jamesjwu/182/head -> origin/gh/jamesjwu/182/head 2025-09-07T07:38:46.4089192Z * [new branch] gh/jamesjwu/182/orig -> origin/gh/jamesjwu/182/orig 2025-09-07T07:38:46.4090132Z * [new branch] gh/jamesjwu/183/base -> origin/gh/jamesjwu/183/base 2025-09-07T07:38:46.4090933Z * [new branch] gh/jamesjwu/183/head -> origin/gh/jamesjwu/183/head 2025-09-07T07:38:46.4091506Z * [new branch] gh/jamesjwu/183/orig -> origin/gh/jamesjwu/183/orig 2025-09-07T07:38:46.4092176Z * [new branch] gh/jamesjwu/184/base -> origin/gh/jamesjwu/184/base 2025-09-07T07:38:46.4092588Z * [new branch] gh/jamesjwu/184/head -> origin/gh/jamesjwu/184/head 2025-09-07T07:38:46.4093196Z * [new branch] gh/jamesjwu/184/orig -> origin/gh/jamesjwu/184/orig 2025-09-07T07:38:46.4093884Z * [new branch] gh/jamesjwu/185/base -> origin/gh/jamesjwu/185/base 2025-09-07T07:38:46.4094375Z * [new branch] gh/jamesjwu/185/head -> origin/gh/jamesjwu/185/head 2025-09-07T07:38:46.4094946Z * [new branch] gh/jamesjwu/185/orig -> origin/gh/jamesjwu/185/orig 2025-09-07T07:38:46.4095607Z * [new branch] gh/jamesjwu/186/base -> origin/gh/jamesjwu/186/base 2025-09-07T07:38:46.4096018Z * [new branch] gh/jamesjwu/186/head -> origin/gh/jamesjwu/186/head 2025-09-07T07:38:46.4096623Z * [new branch] gh/jamesjwu/186/orig -> origin/gh/jamesjwu/186/orig 2025-09-07T07:38:46.4097294Z * [new branch] gh/jamesjwu/187/base -> origin/gh/jamesjwu/187/base 2025-09-07T07:38:46.4097708Z * [new branch] gh/jamesjwu/187/head -> origin/gh/jamesjwu/187/head 2025-09-07T07:38:46.4098330Z * [new branch] gh/jamesjwu/187/orig -> origin/gh/jamesjwu/187/orig 2025-09-07T07:38:46.4099198Z * [new branch] gh/jamesjwu/188/base -> origin/gh/jamesjwu/188/base 2025-09-07T07:38:46.4099664Z * [new branch] gh/jamesjwu/188/head -> origin/gh/jamesjwu/188/head 2025-09-07T07:38:46.4100251Z * [new branch] gh/jamesjwu/188/orig -> origin/gh/jamesjwu/188/orig 2025-09-07T07:38:46.4100908Z * [new branch] gh/jamesjwu/189/base -> origin/gh/jamesjwu/189/base 2025-09-07T07:38:46.4101485Z * [new branch] gh/jamesjwu/189/head -> origin/gh/jamesjwu/189/head 2025-09-07T07:38:46.4101931Z * [new branch] gh/jamesjwu/189/orig -> origin/gh/jamesjwu/189/orig 2025-09-07T07:38:46.4102934Z * [new branch] gh/jamesjwu/190/base -> origin/gh/jamesjwu/190/base 2025-09-07T07:38:46.4103347Z * [new branch] gh/jamesjwu/190/head -> origin/gh/jamesjwu/190/head 2025-09-07T07:38:46.4103918Z * [new branch] gh/jamesjwu/190/orig -> 
origin/gh/jamesjwu/190/orig 2025-09-07T07:38:46.4104748Z * [new branch] gh/jamesjwu/52/base -> origin/gh/jamesjwu/52/base 2025-09-07T07:38:46.4105209Z * [new branch] gh/jamesjwu/52/head -> origin/gh/jamesjwu/52/head 2025-09-07T07:38:46.4106015Z * [new branch] gh/jamesjwu/53/base -> origin/gh/jamesjwu/53/base 2025-09-07T07:38:46.4106799Z * [new branch] gh/jamesjwu/53/head -> origin/gh/jamesjwu/53/head 2025-09-07T07:38:46.4107461Z * [new branch] gh/jamesjwu/54/base -> origin/gh/jamesjwu/54/base 2025-09-07T07:38:46.4107822Z * [new branch] gh/jamesjwu/54/head -> origin/gh/jamesjwu/54/head 2025-09-07T07:38:46.4108671Z * [new branch] gh/jamesjwu/55/base -> origin/gh/jamesjwu/55/base 2025-09-07T07:38:46.4109069Z * [new branch] gh/jamesjwu/55/head -> origin/gh/jamesjwu/55/head 2025-09-07T07:38:46.4109798Z * [new branch] gh/jamesjwu/56/base -> origin/gh/jamesjwu/56/base 2025-09-07T07:38:46.4110197Z * [new branch] gh/jamesjwu/56/head -> origin/gh/jamesjwu/56/head 2025-09-07T07:38:46.4110893Z * [new branch] gh/jamesjwu/57/base -> origin/gh/jamesjwu/57/base 2025-09-07T07:38:46.4111287Z * [new branch] gh/jamesjwu/57/head -> origin/gh/jamesjwu/57/head 2025-09-07T07:38:46.4112088Z * [new branch] gh/jamesjwu/58/base -> origin/gh/jamesjwu/58/base 2025-09-07T07:38:46.4112435Z * [new branch] gh/jamesjwu/58/head -> origin/gh/jamesjwu/58/head 2025-09-07T07:38:46.4113315Z * [new branch] gh/jamesjwu/59/base -> origin/gh/jamesjwu/59/base 2025-09-07T07:38:46.4113597Z * [new branch] gh/jamesjwu/59/head -> origin/gh/jamesjwu/59/head 2025-09-07T07:38:46.4114274Z * [new branch] gh/jamesjwu/60/base -> origin/gh/jamesjwu/60/base 2025-09-07T07:38:46.4114675Z * [new branch] gh/jamesjwu/60/head -> origin/gh/jamesjwu/60/head 2025-09-07T07:38:46.4115423Z * [new branch] gh/jamesjwu/61/base -> origin/gh/jamesjwu/61/base 2025-09-07T07:38:46.4115790Z * [new branch] gh/jamesjwu/61/head -> origin/gh/jamesjwu/61/head 2025-09-07T07:38:46.4116638Z * [new branch] gh/jamesjwu/62/base -> origin/gh/jamesjwu/62/base 2025-09-07T07:38:46.4117027Z * [new branch] gh/jamesjwu/62/head -> origin/gh/jamesjwu/62/head 2025-09-07T07:38:46.4117798Z * [new branch] gh/jamesjwu/63/base -> origin/gh/jamesjwu/63/base 2025-09-07T07:38:46.4118401Z * [new branch] gh/jamesjwu/63/head -> origin/gh/jamesjwu/63/head 2025-09-07T07:38:46.4119130Z * [new branch] gh/jamesjwu/64/base -> origin/gh/jamesjwu/64/base 2025-09-07T07:38:46.4119549Z * [new branch] gh/jamesjwu/64/head -> origin/gh/jamesjwu/64/head 2025-09-07T07:38:46.4120290Z * [new branch] gh/jamesjwu/65/base -> origin/gh/jamesjwu/65/base 2025-09-07T07:38:46.4120665Z * [new branch] gh/jamesjwu/65/head -> origin/gh/jamesjwu/65/head 2025-09-07T07:38:46.4121743Z * [new branch] gh/janeyx99/165/base -> origin/gh/janeyx99/165/base 2025-09-07T07:38:46.4122212Z * [new branch] gh/janeyx99/165/head -> origin/gh/janeyx99/165/head 2025-09-07T07:38:46.4122780Z * [new branch] gh/janeyx99/165/orig -> origin/gh/janeyx99/165/orig 2025-09-07T07:38:46.4123405Z * [new branch] gh/janeyx99/201/base -> origin/gh/janeyx99/201/base 2025-09-07T07:38:46.4123860Z * [new branch] gh/janeyx99/201/head -> origin/gh/janeyx99/201/head 2025-09-07T07:38:46.4124428Z * [new branch] gh/janeyx99/201/orig -> origin/gh/janeyx99/201/orig 2025-09-07T07:38:46.4125699Z * [new branch] gh/janeyx99/225/base -> origin/gh/janeyx99/225/base 2025-09-07T07:38:46.4126312Z * [new branch] gh/janeyx99/225/head -> origin/gh/janeyx99/225/head 2025-09-07T07:38:46.4126756Z * [new branch] gh/janeyx99/225/orig -> origin/gh/janeyx99/225/orig 2025-09-07T07:38:46.4127576Z * 
[new branch] gh/janeyx99/296/base -> origin/gh/janeyx99/296/base 2025-09-07T07:38:46.4127991Z * [new branch] gh/janeyx99/296/head -> origin/gh/janeyx99/296/head 2025-09-07T07:38:46.4128576Z * [new branch] gh/janeyx99/296/orig -> origin/gh/janeyx99/296/orig 2025-09-07T07:38:46.4129270Z * [new branch] gh/janeyx99/297/base -> origin/gh/janeyx99/297/base 2025-09-07T07:38:46.4129722Z * [new branch] gh/janeyx99/297/head -> origin/gh/janeyx99/297/head 2025-09-07T07:38:46.4130297Z * [new branch] gh/janeyx99/297/orig -> origin/gh/janeyx99/297/orig 2025-09-07T07:38:46.4131057Z * [new branch] gh/janeyx99/298/base -> origin/gh/janeyx99/298/base 2025-09-07T07:38:46.4131473Z * [new branch] gh/janeyx99/298/head -> origin/gh/janeyx99/298/head 2025-09-07T07:38:46.4132068Z * [new branch] gh/janeyx99/298/orig -> origin/gh/janeyx99/298/orig 2025-09-07T07:38:46.4132838Z * [new branch] gh/janeyx99/299/base -> origin/gh/janeyx99/299/base 2025-09-07T07:38:46.4133276Z * [new branch] gh/janeyx99/299/head -> origin/gh/janeyx99/299/head 2025-09-07T07:38:46.4133850Z * [new branch] gh/janeyx99/299/orig -> origin/gh/janeyx99/299/orig 2025-09-07T07:38:46.4134664Z * [new branch] gh/janeyx99/300/base -> origin/gh/janeyx99/300/base 2025-09-07T07:38:46.4135437Z * [new branch] gh/janeyx99/300/head -> origin/gh/janeyx99/300/head 2025-09-07T07:38:46.4136009Z * [new branch] gh/janeyx99/300/orig -> origin/gh/janeyx99/300/orig 2025-09-07T07:38:46.4136749Z * [new branch] gh/janeyx99/301/base -> origin/gh/janeyx99/301/base 2025-09-07T07:38:46.4137198Z * [new branch] gh/janeyx99/301/head -> origin/gh/janeyx99/301/head 2025-09-07T07:38:46.4137780Z * [new branch] gh/janeyx99/301/orig -> origin/gh/janeyx99/301/orig 2025-09-07T07:38:46.4138385Z * [new branch] gh/janeyx99/302/base -> origin/gh/janeyx99/302/base 2025-09-07T07:38:46.4138848Z * [new branch] gh/janeyx99/302/head -> origin/gh/janeyx99/302/head 2025-09-07T07:38:46.4139603Z * [new branch] gh/janeyx99/303/base -> origin/gh/janeyx99/303/base 2025-09-07T07:38:46.4140018Z * [new branch] gh/janeyx99/303/head -> origin/gh/janeyx99/303/head 2025-09-07T07:38:46.4141251Z * [new branch] gh/janeyx99/88/base -> origin/gh/janeyx99/88/base 2025-09-07T07:38:46.4141658Z * [new branch] gh/janeyx99/88/head -> origin/gh/janeyx99/88/head 2025-09-07T07:38:46.4142243Z * [new branch] gh/janeyx99/88/orig -> origin/gh/janeyx99/88/orig 2025-09-07T07:38:46.4143146Z * [new branch] gh/jansel/360/base -> origin/gh/jansel/360/base 2025-09-07T07:38:46.4143573Z * [new branch] gh/jansel/360/head -> origin/gh/jansel/360/head 2025-09-07T07:38:46.4144328Z * [new branch] gh/jansel/451/base -> origin/gh/jansel/451/base 2025-09-07T07:38:46.4144981Z * [new branch] gh/jansel/451/head -> origin/gh/jansel/451/head 2025-09-07T07:38:46.4145419Z * [new branch] gh/jansel/451/orig -> origin/gh/jansel/451/orig 2025-09-07T07:38:46.4146198Z * [new branch] gh/jansel/462/base -> origin/gh/jansel/462/base 2025-09-07T07:38:46.4146757Z * [new branch] gh/jansel/462/head -> origin/gh/jansel/462/head 2025-09-07T07:38:46.4147190Z * [new branch] gh/jansel/462/orig -> origin/gh/jansel/462/orig 2025-09-07T07:38:46.4147981Z * [new branch] gh/jansel/531/base -> origin/gh/jansel/531/base 2025-09-07T07:38:46.4148391Z * [new branch] gh/jansel/531/head -> origin/gh/jansel/531/head 2025-09-07T07:38:46.4148974Z * [new branch] gh/jansel/531/orig -> origin/gh/jansel/531/orig 2025-09-07T07:38:46.4149962Z * [new branch] gh/jbschlosser/208/head -> origin/gh/jbschlosser/208/head 2025-09-07T07:38:46.4150720Z * [new branch] gh/jbschlosser/247/base -> 
origin/gh/jbschlosser/247/base 2025-09-07T07:38:46.4151124Z * [new branch] gh/jbschlosser/247/head -> origin/gh/jbschlosser/247/head 2025-09-07T07:38:46.4151760Z * [new branch] gh/jbschlosser/247/orig -> origin/gh/jbschlosser/247/orig 2025-09-07T07:38:46.4152482Z * [new branch] gh/jbschlosser/248/base -> origin/gh/jbschlosser/248/base 2025-09-07T07:38:46.4152935Z * [new branch] gh/jbschlosser/248/head -> origin/gh/jbschlosser/248/head 2025-09-07T07:38:46.4153567Z * [new branch] gh/jbschlosser/248/orig -> origin/gh/jbschlosser/248/orig 2025-09-07T07:38:46.4154511Z * [new branch] gh/jbschlosser/250/base -> origin/gh/jbschlosser/250/base 2025-09-07T07:38:46.4154917Z * [new branch] gh/jbschlosser/250/head -> origin/gh/jbschlosser/250/head 2025-09-07T07:38:46.4155506Z * [new branch] gh/jbschlosser/250/orig -> origin/gh/jbschlosser/250/orig 2025-09-07T07:38:46.4156358Z * [new branch] gh/jiayisunx/59/base -> origin/gh/jiayisunx/59/base 2025-09-07T07:38:46.4156788Z * [new branch] gh/jiayisunx/59/head -> origin/gh/jiayisunx/59/head 2025-09-07T07:38:46.4157407Z * [new branch] gh/jiayisunx/59/orig -> origin/gh/jiayisunx/59/orig 2025-09-07T07:38:46.4158034Z * [new branch] gh/jiayisunx/61/base -> origin/gh/jiayisunx/61/base 2025-09-07T07:38:46.4158483Z * [new branch] gh/jiayisunx/61/head -> origin/gh/jiayisunx/61/head 2025-09-07T07:38:46.4159082Z * [new branch] gh/jiayisunx/61/orig -> origin/gh/jiayisunx/61/orig 2025-09-07T07:38:46.4159739Z * [new branch] gh/jiayisunx/64/base -> origin/gh/jiayisunx/64/base 2025-09-07T07:38:46.4160303Z * [new branch] gh/jiayisunx/64/head -> origin/gh/jiayisunx/64/head 2025-09-07T07:38:46.4160731Z * [new branch] gh/jiayisunx/64/orig -> origin/gh/jiayisunx/64/orig 2025-09-07T07:38:46.4161507Z * [new branch] gh/jiayisunx/65/base -> origin/gh/jiayisunx/65/base 2025-09-07T07:38:46.4161953Z * [new branch] gh/jiayisunx/65/head -> origin/gh/jiayisunx/65/head 2025-09-07T07:38:46.4162639Z * [new branch] gh/jiayisunx/65/orig -> origin/gh/jiayisunx/65/orig 2025-09-07T07:38:46.4163360Z * [new branch] gh/jiayisunx/66/base -> origin/gh/jiayisunx/66/base 2025-09-07T07:38:46.4163765Z * [new branch] gh/jiayisunx/66/head -> origin/gh/jiayisunx/66/head 2025-09-07T07:38:46.4164347Z * [new branch] gh/jiayisunx/66/orig -> origin/gh/jiayisunx/66/orig 2025-09-07T07:38:46.4165126Z * [new branch] gh/jiayisunx/67/base -> origin/gh/jiayisunx/67/base 2025-09-07T07:38:46.4165528Z * [new branch] gh/jiayisunx/67/head -> origin/gh/jiayisunx/67/head 2025-09-07T07:38:46.4166113Z * [new branch] gh/jiayisunx/67/orig -> origin/gh/jiayisunx/67/orig 2025-09-07T07:38:46.4166794Z * [new branch] gh/jiayisunx/68/base -> origin/gh/jiayisunx/68/base 2025-09-07T07:38:46.4167412Z * [new branch] gh/jiayisunx/68/head -> origin/gh/jiayisunx/68/head 2025-09-07T07:38:46.4167748Z * [new branch] gh/jiayisunx/68/orig -> origin/gh/jiayisunx/68/orig 2025-09-07T07:38:46.4168552Z * [new branch] gh/jiayisunx/69/base -> origin/gh/jiayisunx/69/base 2025-09-07T07:38:46.4168945Z * [new branch] gh/jiayisunx/69/head -> origin/gh/jiayisunx/69/head 2025-09-07T07:38:46.4169539Z * [new branch] gh/jiayisunx/69/orig -> origin/gh/jiayisunx/69/orig 2025-09-07T07:38:46.4170235Z * [new branch] gh/jiayisunx/70/base -> origin/gh/jiayisunx/70/base 2025-09-07T07:38:46.4170653Z * [new branch] gh/jiayisunx/70/head -> origin/gh/jiayisunx/70/head 2025-09-07T07:38:46.4171384Z * [new branch] gh/jiayisunx/70/orig -> origin/gh/jiayisunx/70/orig 2025-09-07T07:38:46.4172096Z * [new branch] gh/jiayisunx/71/base -> origin/gh/jiayisunx/71/base 
2025-09-07T07:38:46.4172493Z * [new branch] gh/jiayisunx/71/head -> origin/gh/jiayisunx/71/head 2025-09-07T07:38:46.4173069Z * [new branch] gh/jiayisunx/71/orig -> origin/gh/jiayisunx/71/orig 2025-09-07T07:38:46.4173804Z * [new branch] gh/jiayisunx/72/base -> origin/gh/jiayisunx/72/base 2025-09-07T07:38:46.4174376Z * [new branch] gh/jiayisunx/72/head -> origin/gh/jiayisunx/72/head 2025-09-07T07:38:46.4174811Z * [new branch] gh/jiayisunx/72/orig -> origin/gh/jiayisunx/72/orig 2025-09-07T07:38:46.4175622Z * [new branch] gh/jiayisunx/73/base -> origin/gh/jiayisunx/73/base 2025-09-07T07:38:46.4176056Z * [new branch] gh/jiayisunx/73/head -> origin/gh/jiayisunx/73/head 2025-09-07T07:38:46.4176667Z * [new branch] gh/jiayisunx/73/orig -> origin/gh/jiayisunx/73/orig 2025-09-07T07:38:46.4177332Z * [new branch] gh/jiayisunx/74/base -> origin/gh/jiayisunx/74/base 2025-09-07T07:38:46.4177783Z * [new branch] gh/jiayisunx/74/head -> origin/gh/jiayisunx/74/head 2025-09-07T07:38:46.4178377Z * [new branch] gh/jiayisunx/74/orig -> origin/gh/jiayisunx/74/orig 2025-09-07T07:38:46.4179077Z * [new branch] gh/jiayisunx/75/base -> origin/gh/jiayisunx/75/base 2025-09-07T07:38:46.4179478Z * [new branch] gh/jiayisunx/75/head -> origin/gh/jiayisunx/75/head 2025-09-07T07:38:46.4180071Z * [new branch] gh/jiayisunx/75/orig -> origin/gh/jiayisunx/75/orig 2025-09-07T07:38:46.4180704Z * [new branch] gh/jiayisunx/76/base -> origin/gh/jiayisunx/76/base 2025-09-07T07:38:46.4181169Z * [new branch] gh/jiayisunx/76/head -> origin/gh/jiayisunx/76/head 2025-09-07T07:38:46.4181760Z * [new branch] gh/jiayisunx/76/orig -> origin/gh/jiayisunx/76/orig 2025-09-07T07:38:46.4182621Z * [new branch] gh/jjwu@meta.com/1/base -> origin/gh/jjwu@meta.com/1/base 2025-09-07T07:38:46.4182998Z * [new branch] gh/jjwu@meta.com/1/head -> origin/gh/jjwu@meta.com/1/head 2025-09-07T07:38:46.4183946Z * [new branch] gh/justinchuby/111/base -> origin/gh/justinchuby/111/base 2025-09-07T07:38:46.4184425Z * [new branch] gh/justinchuby/111/head -> origin/gh/justinchuby/111/head 2025-09-07T07:38:46.4185098Z * [new branch] gh/justinchuby/111/orig -> origin/gh/justinchuby/111/orig 2025-09-07T07:38:46.4185861Z * [new branch] gh/justinchuby/112/base -> origin/gh/justinchuby/112/base 2025-09-07T07:38:46.4186238Z * [new branch] gh/justinchuby/112/head -> origin/gh/justinchuby/112/head 2025-09-07T07:38:46.4186862Z * [new branch] gh/justinchuby/112/orig -> origin/gh/justinchuby/112/orig 2025-09-07T07:38:46.4187567Z * [new branch] gh/justinchuby/113/base -> origin/gh/justinchuby/113/base 2025-09-07T07:38:46.4188459Z * [new branch] gh/justinchuby/113/head -> origin/gh/justinchuby/113/head 2025-09-07T07:38:46.4189055Z * [new branch] gh/justinchuby/113/orig -> origin/gh/justinchuby/113/orig 2025-09-07T07:38:46.4189808Z * [new branch] gh/justinchuby/114/base -> origin/gh/justinchuby/114/base 2025-09-07T07:38:46.4190243Z * [new branch] gh/justinchuby/114/head -> origin/gh/justinchuby/114/head 2025-09-07T07:38:46.4190834Z * [new branch] gh/justinchuby/114/orig -> origin/gh/justinchuby/114/orig 2025-09-07T07:38:46.4191544Z * [new branch] gh/justinchuby/115/base -> origin/gh/justinchuby/115/base 2025-09-07T07:38:46.4191962Z * [new branch] gh/justinchuby/115/head -> origin/gh/justinchuby/115/head 2025-09-07T07:38:46.4192443Z * [new branch] gh/justinchuby/115/orig -> origin/gh/justinchuby/115/orig 2025-09-07T07:38:46.4193343Z * [new branch] gh/karthickai/1/base -> origin/gh/karthickai/1/base 2025-09-07T07:38:46.4193764Z * [new branch] gh/karthickai/1/head -> 
origin/gh/karthickai/1/head 2025-09-07T07:38:46.4194372Z * [new branch] gh/karthickai/1/orig -> origin/gh/karthickai/1/orig 2025-09-07T07:38:46.4195105Z * [new branch] gh/karthickai/2/base -> origin/gh/karthickai/2/base 2025-09-07T07:38:46.4195528Z * [new branch] gh/karthickai/2/head -> origin/gh/karthickai/2/head 2025-09-07T07:38:46.4196102Z * [new branch] gh/karthickai/2/orig -> origin/gh/karthickai/2/orig 2025-09-07T07:38:46.4196980Z * [new branch] gh/kurtamohler/32/base -> origin/gh/kurtamohler/32/base 2025-09-07T07:38:46.4197373Z * [new branch] gh/kurtamohler/32/head -> origin/gh/kurtamohler/32/head 2025-09-07T07:38:46.4198003Z * [new branch] gh/kurtamohler/32/orig -> origin/gh/kurtamohler/32/orig 2025-09-07T07:38:46.4198825Z * [new branch] gh/kurtamohler/33/base -> origin/gh/kurtamohler/33/base 2025-09-07T07:38:46.4199241Z * [new branch] gh/kurtamohler/33/head -> origin/gh/kurtamohler/33/head 2025-09-07T07:38:46.4199806Z * [new branch] gh/kurtamohler/33/orig -> origin/gh/kurtamohler/33/orig 2025-09-07T07:38:46.4200567Z * [new branch] gh/kurtamohler/34/base -> origin/gh/kurtamohler/34/base 2025-09-07T07:38:46.4200997Z * [new branch] gh/kurtamohler/34/head -> origin/gh/kurtamohler/34/head 2025-09-07T07:38:46.4201584Z * [new branch] gh/kurtamohler/34/orig -> origin/gh/kurtamohler/34/orig 2025-09-07T07:38:46.4202597Z * [new branch] gh/kurtamohler/41/base -> origin/gh/kurtamohler/41/base 2025-09-07T07:38:46.4203053Z * [new branch] gh/kurtamohler/41/head -> origin/gh/kurtamohler/41/head 2025-09-07T07:38:46.4203647Z * [new branch] gh/kurtamohler/41/orig -> origin/gh/kurtamohler/41/orig 2025-09-07T07:38:46.4204274Z * [new branch] gh/kurtamohler/46/base -> origin/gh/kurtamohler/46/base 2025-09-07T07:38:46.4204717Z * [new branch] gh/kurtamohler/46/head -> origin/gh/kurtamohler/46/head 2025-09-07T07:38:46.4205302Z * [new branch] gh/kurtamohler/46/orig -> origin/gh/kurtamohler/46/orig 2025-09-07T07:38:46.4205990Z * [new branch] gh/kurtamohler/47/base -> origin/gh/kurtamohler/47/base 2025-09-07T07:38:46.4206579Z * [new branch] gh/kurtamohler/47/head -> origin/gh/kurtamohler/47/head 2025-09-07T07:38:46.4207177Z * [new branch] gh/kurtamohler/47/orig -> origin/gh/kurtamohler/47/orig 2025-09-07T07:38:46.4207885Z * [new branch] gh/kurtamohler/48/base -> origin/gh/kurtamohler/48/base 2025-09-07T07:38:46.4208288Z * [new branch] gh/kurtamohler/48/head -> origin/gh/kurtamohler/48/head 2025-09-07T07:38:46.4208911Z * [new branch] gh/kurtamohler/48/orig -> origin/gh/kurtamohler/48/orig 2025-09-07T07:38:46.4209595Z * [new branch] gh/kurtamohler/49/base -> origin/gh/kurtamohler/49/base 2025-09-07T07:38:46.4210024Z * [new branch] gh/kurtamohler/49/head -> origin/gh/kurtamohler/49/head 2025-09-07T07:38:46.4210688Z * [new branch] gh/kurtamohler/49/orig -> origin/gh/kurtamohler/49/orig 2025-09-07T07:38:46.4211330Z * [new branch] gh/kurtamohler/50/base -> origin/gh/kurtamohler/50/base 2025-09-07T07:38:46.4211735Z * [new branch] gh/kurtamohler/50/head -> origin/gh/kurtamohler/50/head 2025-09-07T07:38:46.4212313Z * [new branch] gh/kurtamohler/50/orig -> origin/gh/kurtamohler/50/orig 2025-09-07T07:38:46.4213301Z * [new branch] gh/kwen2501/130/base -> origin/gh/kwen2501/130/base 2025-09-07T07:38:46.4213924Z * [new branch] gh/kwen2501/130/head -> origin/gh/kwen2501/130/head 2025-09-07T07:38:46.4214527Z * [new branch] gh/kwen2501/130/orig -> origin/gh/kwen2501/130/orig 2025-09-07T07:38:46.4215239Z * [new branch] gh/kwen2501/15/base -> origin/gh/kwen2501/15/base 2025-09-07T07:38:46.4215813Z * [new branch] 
gh/kwen2501/15/head -> origin/gh/kwen2501/15/head 2025-09-07T07:38:46.4216617Z * [new branch] gh/kwen2501/156/base -> origin/gh/kwen2501/156/base 2025-09-07T07:38:46.4217039Z * [new branch] gh/kwen2501/156/head -> origin/gh/kwen2501/156/head 2025-09-07T07:38:46.4217602Z * [new branch] gh/kwen2501/156/orig -> origin/gh/kwen2501/156/orig 2025-09-07T07:38:46.4218299Z * [new branch] gh/kwen2501/170/base -> origin/gh/kwen2501/170/base 2025-09-07T07:38:46.4218745Z * [new branch] gh/kwen2501/170/head -> origin/gh/kwen2501/170/head 2025-09-07T07:38:46.4219587Z * [new branch] gh/kwen2501/186/base -> origin/gh/kwen2501/186/base 2025-09-07T07:38:46.4220049Z * [new branch] gh/kwen2501/186/head -> origin/gh/kwen2501/186/head 2025-09-07T07:38:46.4220610Z * [new branch] gh/kwen2501/186/orig -> origin/gh/kwen2501/186/orig 2025-09-07T07:38:46.4221231Z * [new branch] gh/kwen2501/187/base -> origin/gh/kwen2501/187/base 2025-09-07T07:38:46.4221746Z * [new branch] gh/kwen2501/187/head -> origin/gh/kwen2501/187/head 2025-09-07T07:38:46.4222319Z * [new branch] gh/kwen2501/187/orig -> origin/gh/kwen2501/187/orig 2025-09-07T07:38:46.4223081Z * [new branch] gh/kwen2501/188/base -> origin/gh/kwen2501/188/base 2025-09-07T07:38:46.4223497Z * [new branch] gh/kwen2501/188/head -> origin/gh/kwen2501/188/head 2025-09-07T07:38:46.4224118Z * [new branch] gh/kwen2501/188/orig -> origin/gh/kwen2501/188/orig 2025-09-07T07:38:46.4224853Z * [new branch] gh/kwen2501/194/base -> origin/gh/kwen2501/194/base 2025-09-07T07:38:46.4225464Z * [new branch] gh/kwen2501/194/head -> origin/gh/kwen2501/194/head 2025-09-07T07:38:46.4225892Z * [new branch] gh/kwen2501/194/orig -> origin/gh/kwen2501/194/orig 2025-09-07T07:38:46.4226657Z * [new branch] gh/kwen2501/199/base -> origin/gh/kwen2501/199/base 2025-09-07T07:38:46.4227066Z * [new branch] gh/kwen2501/199/head -> origin/gh/kwen2501/199/head 2025-09-07T07:38:46.4227978Z * [new branch] gh/kwen2501/199/orig -> origin/gh/kwen2501/199/orig 2025-09-07T07:38:46.4228605Z * [new branch] gh/kwen2501/200/base -> origin/gh/kwen2501/200/base 2025-09-07T07:38:46.4229066Z * [new branch] gh/kwen2501/200/head -> origin/gh/kwen2501/200/head 2025-09-07T07:38:46.4229632Z * [new branch] gh/kwen2501/200/orig -> origin/gh/kwen2501/200/orig 2025-09-07T07:38:46.4230404Z * [new branch] gh/kwen2501/201/base -> origin/gh/kwen2501/201/base 2025-09-07T07:38:46.4230808Z * [new branch] gh/kwen2501/201/head -> origin/gh/kwen2501/201/head 2025-09-07T07:38:46.4231426Z * [new branch] gh/kwen2501/201/orig -> origin/gh/kwen2501/201/orig 2025-09-07T07:38:46.4232131Z * [new branch] gh/kwen2501/203/base -> origin/gh/kwen2501/203/base 2025-09-07T07:38:46.4232556Z * [new branch] gh/kwen2501/203/head -> origin/gh/kwen2501/203/head 2025-09-07T07:38:46.4233124Z * [new branch] gh/kwen2501/203/orig -> origin/gh/kwen2501/203/orig 2025-09-07T07:38:46.4233813Z * [new branch] gh/kwen2501/204/base -> origin/gh/kwen2501/204/base 2025-09-07T07:38:46.4234432Z * [new branch] gh/kwen2501/204/head -> origin/gh/kwen2501/204/head 2025-09-07T07:38:46.4234842Z * [new branch] gh/kwen2501/204/orig -> origin/gh/kwen2501/204/orig 2025-09-07T07:38:46.4235641Z * [new branch] gh/kwen2501/205/base -> origin/gh/kwen2501/205/base 2025-09-07T07:38:46.4236050Z * [new branch] gh/kwen2501/205/head -> origin/gh/kwen2501/205/head 2025-09-07T07:38:46.4236626Z * [new branch] gh/kwen2501/205/orig -> origin/gh/kwen2501/205/orig 2025-09-07T07:38:46.4237382Z * [new branch] gh/kwen2501/206/base -> origin/gh/kwen2501/206/base 2025-09-07T07:38:46.4238195Z * [new branch] 
gh/kwen2501/206/head -> origin/gh/kwen2501/206/head 2025-09-07T07:38:46.4238640Z * [new branch] gh/kwen2501/206/orig -> origin/gh/kwen2501/206/orig 2025-09-07T07:38:46.4239401Z * [new branch] gh/kwen2501/207/base -> origin/gh/kwen2501/207/base 2025-09-07T07:38:46.4239796Z * [new branch] gh/kwen2501/207/head -> origin/gh/kwen2501/207/head 2025-09-07T07:38:46.4240380Z * [new branch] gh/kwen2501/207/orig -> origin/gh/kwen2501/207/orig 2025-09-07T07:38:46.4241052Z * [new branch] gh/kwen2501/208/base -> origin/gh/kwen2501/208/base 2025-09-07T07:38:46.4241524Z * [new branch] gh/kwen2501/208/head -> origin/gh/kwen2501/208/head 2025-09-07T07:38:46.4242057Z * [new branch] gh/kwen2501/208/orig -> origin/gh/kwen2501/208/orig 2025-09-07T07:38:46.4243129Z * [new branch] gh/kwen2501/209/base -> origin/gh/kwen2501/209/base 2025-09-07T07:38:46.4243744Z * [new branch] gh/kwen2501/209/head -> origin/gh/kwen2501/209/head 2025-09-07T07:38:46.4244151Z * [new branch] gh/kwen2501/209/orig -> origin/gh/kwen2501/209/orig 2025-09-07T07:38:46.4245024Z * [new branch] gh/kwen2501/210/base -> origin/gh/kwen2501/210/base 2025-09-07T07:38:46.4245416Z * [new branch] gh/kwen2501/210/head -> origin/gh/kwen2501/210/head 2025-09-07T07:38:46.4245998Z * [new branch] gh/kwen2501/210/orig -> origin/gh/kwen2501/210/orig 2025-09-07T07:38:46.4246735Z * [new branch] gh/kwen2501/211/base -> origin/gh/kwen2501/211/base 2025-09-07T07:38:46.4247175Z * [new branch] gh/kwen2501/211/head -> origin/gh/kwen2501/211/head 2025-09-07T07:38:46.4247954Z * [new branch] gh/kwen2501/212/base -> origin/gh/kwen2501/212/base 2025-09-07T07:38:46.4248370Z * [new branch] gh/kwen2501/212/head -> origin/gh/kwen2501/212/head 2025-09-07T07:38:46.4248947Z * [new branch] gh/kwen2501/212/orig -> origin/gh/kwen2501/212/orig 2025-09-07T07:38:46.4249629Z * [new branch] gh/kwen2501/213/base -> origin/gh/kwen2501/213/base 2025-09-07T07:38:46.4250049Z * [new branch] gh/kwen2501/213/head -> origin/gh/kwen2501/213/head 2025-09-07T07:38:46.4250609Z * [new branch] gh/kwen2501/213/orig -> origin/gh/kwen2501/213/orig 2025-09-07T07:38:46.4251420Z * [new branch] gh/kwen2501/214/base -> origin/gh/kwen2501/214/base 2025-09-07T07:38:46.4252020Z * [new branch] gh/kwen2501/214/head -> origin/gh/kwen2501/214/head 2025-09-07T07:38:46.4252954Z * [new branch] gh/kwen2501/214/orig -> origin/gh/kwen2501/214/orig 2025-09-07T07:38:46.4253690Z * [new branch] gh/kwen2501/215/base -> origin/gh/kwen2501/215/base 2025-09-07T07:38:46.4254097Z * [new branch] gh/kwen2501/215/head -> origin/gh/kwen2501/215/head 2025-09-07T07:38:46.4254695Z * [new branch] gh/kwen2501/215/orig -> origin/gh/kwen2501/215/orig 2025-09-07T07:38:46.4255362Z * [new branch] gh/kwen2501/216/base -> origin/gh/kwen2501/216/base 2025-09-07T07:38:46.4255770Z * [new branch] gh/kwen2501/216/head -> origin/gh/kwen2501/216/head 2025-09-07T07:38:46.4256342Z * [new branch] gh/kwen2501/216/orig -> origin/gh/kwen2501/216/orig 2025-09-07T07:38:46.4257006Z * [new branch] gh/kwen2501/217/base -> origin/gh/kwen2501/217/base 2025-09-07T07:38:46.4257425Z * [new branch] gh/kwen2501/217/head -> origin/gh/kwen2501/217/head 2025-09-07T07:38:46.4257999Z * [new branch] gh/kwen2501/217/orig -> origin/gh/kwen2501/217/orig 2025-09-07T07:38:46.4258758Z * [new branch] gh/kwen2501/218/base -> origin/gh/kwen2501/218/base 2025-09-07T07:38:46.4259224Z * [new branch] gh/kwen2501/218/head -> origin/gh/kwen2501/218/head 2025-09-07T07:38:46.4259790Z * [new branch] gh/kwen2501/218/orig -> origin/gh/kwen2501/218/orig 2025-09-07T07:38:46.4260471Z * [new branch] 
gh/kwen2501/219/base -> origin/gh/kwen2501/219/base 2025-09-07T07:38:46.4261047Z * [new branch] gh/kwen2501/219/head -> origin/gh/kwen2501/219/head 2025-09-07T07:38:46.4261483Z * [new branch] gh/kwen2501/219/orig -> origin/gh/kwen2501/219/orig 2025-09-07T07:38:46.4262331Z * [new branch] gh/kwen2501/220/base -> origin/gh/kwen2501/220/base 2025-09-07T07:38:46.4262714Z * [new branch] gh/kwen2501/220/head -> origin/gh/kwen2501/220/head 2025-09-07T07:38:46.4263298Z * [new branch] gh/kwen2501/220/orig -> origin/gh/kwen2501/220/orig 2025-09-07T07:38:46.4264012Z * [new branch] gh/kwen2501/221/base -> origin/gh/kwen2501/221/base 2025-09-07T07:38:46.4264407Z * [new branch] gh/kwen2501/221/head -> origin/gh/kwen2501/221/head 2025-09-07T07:38:46.4264909Z * [new branch] gh/kwen2501/221/orig -> origin/gh/kwen2501/221/orig 2025-09-07T07:38:46.4265711Z * [new branch] gh/kwen2501/222/base -> origin/gh/kwen2501/222/base 2025-09-07T07:38:46.4266101Z * [new branch] gh/kwen2501/222/head -> origin/gh/kwen2501/222/head 2025-09-07T07:38:46.4266708Z * [new branch] gh/kwen2501/222/orig -> origin/gh/kwen2501/222/orig 2025-09-07T07:38:46.4267373Z * [new branch] gh/kwen2501/223/base -> origin/gh/kwen2501/223/base 2025-09-07T07:38:46.4267779Z * [new branch] gh/kwen2501/223/head -> origin/gh/kwen2501/223/head 2025-09-07T07:38:46.4268345Z * [new branch] gh/kwen2501/223/orig -> origin/gh/kwen2501/223/orig 2025-09-07T07:38:46.4269023Z * [new branch] gh/kwen2501/224/base -> origin/gh/kwen2501/224/base 2025-09-07T07:38:46.4269635Z * [new branch] gh/kwen2501/224/head -> origin/gh/kwen2501/224/head 2025-09-07T07:38:46.4270425Z * [new branch] gh/kwen2501/224/orig -> origin/gh/kwen2501/224/orig 2025-09-07T07:38:46.4271154Z * [new branch] gh/kwen2501/225/base -> origin/gh/kwen2501/225/base 2025-09-07T07:38:46.4271547Z * [new branch] gh/kwen2501/225/head -> origin/gh/kwen2501/225/head 2025-09-07T07:38:46.4272122Z * [new branch] gh/kwen2501/225/orig -> origin/gh/kwen2501/225/orig 2025-09-07T07:38:46.4272868Z * [new branch] gh/kwen2501/226/base -> origin/gh/kwen2501/226/base 2025-09-07T07:38:46.4273299Z * [new branch] gh/kwen2501/226/head -> origin/gh/kwen2501/226/head 2025-09-07T07:38:46.4273893Z * [new branch] gh/kwen2501/226/orig -> origin/gh/kwen2501/226/orig 2025-09-07T07:38:46.4274552Z * [new branch] gh/kwen2501/227/base -> origin/gh/kwen2501/227/base 2025-09-07T07:38:46.4274943Z * [new branch] gh/kwen2501/227/head -> origin/gh/kwen2501/227/head 2025-09-07T07:38:46.4275506Z * [new branch] gh/kwen2501/227/orig -> origin/gh/kwen2501/227/orig 2025-09-07T07:38:46.4276218Z * [new branch] gh/kwen2501/228/base -> origin/gh/kwen2501/228/base 2025-09-07T07:38:46.4276671Z * [new branch] gh/kwen2501/228/head -> origin/gh/kwen2501/228/head 2025-09-07T07:38:46.4277196Z * [new branch] gh/kwen2501/228/orig -> origin/gh/kwen2501/228/orig 2025-09-07T07:38:46.4277924Z * [new branch] gh/kwen2501/229/base -> origin/gh/kwen2501/229/base 2025-09-07T07:38:46.4278559Z * [new branch] gh/kwen2501/229/head -> origin/gh/kwen2501/229/head 2025-09-07T07:38:46.4278991Z * [new branch] gh/kwen2501/229/orig -> origin/gh/kwen2501/229/orig 2025-09-07T07:38:46.4279871Z * [new branch] gh/kwen2501/230/base -> origin/gh/kwen2501/230/base 2025-09-07T07:38:46.4280281Z * [new branch] gh/kwen2501/230/head -> origin/gh/kwen2501/230/head 2025-09-07T07:38:46.4280858Z * [new branch] gh/kwen2501/230/orig -> origin/gh/kwen2501/230/orig 2025-09-07T07:38:46.4281655Z * [new branch] gh/kwen2501/231/base -> origin/gh/kwen2501/231/base 2025-09-07T07:38:46.4282054Z * [new branch] 
gh/kwen2501/231/head -> origin/gh/kwen2501/231/head 2025-09-07T07:38:46.4282651Z * [new branch] gh/kwen2501/231/orig -> origin/gh/kwen2501/231/orig 2025-09-07T07:38:46.4283335Z * [new branch] gh/kwen2501/232/base -> origin/gh/kwen2501/232/base 2025-09-07T07:38:46.4283794Z * [new branch] gh/kwen2501/232/head -> origin/gh/kwen2501/232/head 2025-09-07T07:38:46.4284361Z * [new branch] gh/kwen2501/232/orig -> origin/gh/kwen2501/232/orig 2025-09-07T07:38:46.4285467Z * [new branch] gh/laithsakka/156/base -> origin/gh/laithsakka/156/base 2025-09-07T07:38:46.4285819Z * [new branch] gh/laithsakka/156/head -> origin/gh/laithsakka/156/head 2025-09-07T07:38:46.4286412Z * [new branch] gh/laithsakka/156/orig -> origin/gh/laithsakka/156/orig 2025-09-07T07:38:46.4287358Z * [new branch] gh/laithsakka/160/base -> origin/gh/laithsakka/160/base 2025-09-07T07:38:46.4288021Z * [new branch] gh/laithsakka/160/head -> origin/gh/laithsakka/160/head 2025-09-07T07:38:46.4288421Z * [new branch] gh/laithsakka/160/orig -> origin/gh/laithsakka/160/orig 2025-09-07T07:38:46.4289239Z * [new branch] gh/laithsakka/178/base -> origin/gh/laithsakka/178/base 2025-09-07T07:38:46.4289716Z * [new branch] gh/laithsakka/178/head -> origin/gh/laithsakka/178/head 2025-09-07T07:38:46.4290328Z * [new branch] gh/laithsakka/178/orig -> origin/gh/laithsakka/178/orig 2025-09-07T07:38:46.4291054Z * [new branch] gh/laithsakka/191/base -> origin/gh/laithsakka/191/base 2025-09-07T07:38:46.4291442Z * [new branch] gh/laithsakka/191/head -> origin/gh/laithsakka/191/head 2025-09-07T07:38:46.4292041Z * [new branch] gh/laithsakka/191/orig -> origin/gh/laithsakka/191/orig 2025-09-07T07:38:46.4292730Z * [new branch] gh/laithsakka/237/base -> origin/gh/laithsakka/237/base 2025-09-07T07:38:46.4293151Z * [new branch] gh/laithsakka/237/head -> origin/gh/laithsakka/237/head 2025-09-07T07:38:46.4293783Z * [new branch] gh/laithsakka/237/orig -> origin/gh/laithsakka/237/orig 2025-09-07T07:38:46.4294556Z * [new branch] gh/laithsakka/249/base -> origin/gh/laithsakka/249/base 2025-09-07T07:38:46.4294997Z * [new branch] gh/laithsakka/249/head -> origin/gh/laithsakka/249/head 2025-09-07T07:38:46.4295572Z * [new branch] gh/laithsakka/249/orig -> origin/gh/laithsakka/249/orig 2025-09-07T07:38:46.4296703Z * [new branch] gh/laithsakka/251/base -> origin/gh/laithsakka/251/base 2025-09-07T07:38:46.4297135Z * [new branch] gh/laithsakka/251/head -> origin/gh/laithsakka/251/head 2025-09-07T07:38:46.4297711Z * [new branch] gh/laithsakka/251/orig -> origin/gh/laithsakka/251/orig 2025-09-07T07:38:46.4298462Z * [new branch] gh/laithsakka/254/base -> origin/gh/laithsakka/254/base 2025-09-07T07:38:46.4298861Z * [new branch] gh/laithsakka/254/head -> origin/gh/laithsakka/254/head 2025-09-07T07:38:46.4299463Z * [new branch] gh/laithsakka/254/orig -> origin/gh/laithsakka/254/orig 2025-09-07T07:38:46.4300263Z * [new branch] gh/laithsakka/255/base -> origin/gh/laithsakka/255/base 2025-09-07T07:38:46.4300638Z * [new branch] gh/laithsakka/255/head -> origin/gh/laithsakka/255/head 2025-09-07T07:38:46.4301234Z * [new branch] gh/laithsakka/255/orig -> origin/gh/laithsakka/255/orig 2025-09-07T07:38:46.4301909Z * [new branch] gh/laithsakka/256/base -> origin/gh/laithsakka/256/base 2025-09-07T07:38:46.4302376Z * [new branch] gh/laithsakka/256/head -> origin/gh/laithsakka/256/head 2025-09-07T07:38:46.4302825Z * [new branch] gh/laithsakka/256/orig -> origin/gh/laithsakka/256/orig 2025-09-07T07:38:46.4303630Z * [new branch] gh/laithsakka/257/base -> origin/gh/laithsakka/257/base 
2025-09-07T07:38:46.4304034Z * [new branch] gh/laithsakka/257/head -> origin/gh/laithsakka/257/head 2025-09-07T07:38:46.4304616Z * [new branch] gh/laithsakka/257/orig -> origin/gh/laithsakka/257/orig 2025-09-07T07:38:46.4305472Z * [new branch] gh/laithsakka/258/base -> origin/gh/laithsakka/258/base 2025-09-07T07:38:46.4305933Z * [new branch] gh/laithsakka/258/head -> origin/gh/laithsakka/258/head 2025-09-07T07:38:46.4306402Z * [new branch] gh/laithsakka/258/orig -> origin/gh/laithsakka/258/orig 2025-09-07T07:38:46.4307221Z * [new branch] gh/laithsakka/259/base -> origin/gh/laithsakka/259/base 2025-09-07T07:38:46.4307665Z * [new branch] gh/laithsakka/259/head -> origin/gh/laithsakka/259/head 2025-09-07T07:38:46.4308237Z * [new branch] gh/laithsakka/259/orig -> origin/gh/laithsakka/259/orig 2025-09-07T07:38:46.4308967Z * [new branch] gh/laithsakka/260/base -> origin/gh/laithsakka/260/base 2025-09-07T07:38:46.4309377Z * [new branch] gh/laithsakka/260/head -> origin/gh/laithsakka/260/head 2025-09-07T07:38:46.4309989Z * [new branch] gh/laithsakka/260/orig -> origin/gh/laithsakka/260/orig 2025-09-07T07:38:46.4310681Z * [new branch] gh/laithsakka/261/base -> origin/gh/laithsakka/261/base 2025-09-07T07:38:46.4311152Z * [new branch] gh/laithsakka/261/head -> origin/gh/laithsakka/261/head 2025-09-07T07:38:46.4311721Z * [new branch] gh/laithsakka/261/orig -> origin/gh/laithsakka/261/orig 2025-09-07T07:38:46.4312697Z * [new branch] gh/laithsakka/262/base -> origin/gh/laithsakka/262/base 2025-09-07T07:38:46.4313513Z * [new branch] gh/laithsakka/262/head -> origin/gh/laithsakka/262/head 2025-09-07T07:38:46.4313969Z * [new branch] gh/laithsakka/262/orig -> origin/gh/laithsakka/262/orig 2025-09-07T07:38:46.4315205Z * [new branch] gh/laithsakka/263/base -> origin/gh/laithsakka/263/base 2025-09-07T07:38:46.4315620Z * [new branch] gh/laithsakka/263/head -> origin/gh/laithsakka/263/head 2025-09-07T07:38:46.4316169Z * [new branch] gh/laithsakka/263/orig -> origin/gh/laithsakka/263/orig 2025-09-07T07:38:46.4317005Z * [new branch] gh/laithsakka/264/base -> origin/gh/laithsakka/264/base 2025-09-07T07:38:46.4317405Z * [new branch] gh/laithsakka/264/head -> origin/gh/laithsakka/264/head 2025-09-07T07:38:46.4317982Z * [new branch] gh/laithsakka/264/orig -> origin/gh/laithsakka/264/orig 2025-09-07T07:38:46.4318761Z * [new branch] gh/laithsakka/265/base -> origin/gh/laithsakka/265/base 2025-09-07T07:38:46.4319175Z * [new branch] gh/laithsakka/265/head -> origin/gh/laithsakka/265/head 2025-09-07T07:38:46.4320061Z * [new branch] gh/laithsakka/265/orig -> origin/gh/laithsakka/265/orig 2025-09-07T07:38:46.4320790Z * [new branch] gh/laithsakka/266/base -> origin/gh/laithsakka/266/base 2025-09-07T07:38:46.4321196Z * [new branch] gh/laithsakka/266/head -> origin/gh/laithsakka/266/head 2025-09-07T07:38:46.4321805Z * [new branch] gh/laithsakka/266/orig -> origin/gh/laithsakka/266/orig 2025-09-07T07:38:46.4322498Z * [new branch] gh/laithsakka/267/base -> origin/gh/laithsakka/267/base 2025-09-07T07:38:46.4322899Z * [new branch] gh/laithsakka/267/head -> origin/gh/laithsakka/267/head 2025-09-07T07:38:46.4323470Z * [new branch] gh/laithsakka/267/orig -> origin/gh/laithsakka/267/orig 2025-09-07T07:38:46.4324385Z * [new branch] gh/laithsakka/268/base -> origin/gh/laithsakka/268/base 2025-09-07T07:38:46.4324805Z * [new branch] gh/laithsakka/268/head -> origin/gh/laithsakka/268/head 2025-09-07T07:38:46.4325413Z * [new branch] gh/laithsakka/268/orig -> origin/gh/laithsakka/268/orig 2025-09-07T07:38:46.4326230Z * [new branch] 
gh/laithsakka/28/base -> origin/gh/laithsakka/28/base 2025-09-07T07:38:46.4326859Z * [new branch] gh/laithsakka/29/base -> origin/gh/laithsakka/29/base 2025-09-07T07:38:46.4327538Z * [new branch] gh/laithsakka/30/base -> origin/gh/laithsakka/30/base 2025-09-07T07:38:46.4327962Z * [new branch] gh/laithsakka/30/head -> origin/gh/laithsakka/30/head 2025-09-07T07:38:46.4328693Z * [new branch] gh/laithsakka/31/base -> origin/gh/laithsakka/31/base 2025-09-07T07:38:46.4329098Z * [new branch] gh/laithsakka/31/head -> origin/gh/laithsakka/31/head 2025-09-07T07:38:46.4329822Z * [new branch] gh/laithsakka/32/base -> origin/gh/laithsakka/32/base 2025-09-07T07:38:46.4330250Z * [new branch] gh/laithsakka/32/head -> origin/gh/laithsakka/32/head 2025-09-07T07:38:46.4332636Z * [new branch] gh/lucaskabela/1/base -> origin/gh/lucaskabela/1/base 2025-09-07T07:38:46.4333102Z * [new branch] gh/lucaskabela/1/head -> origin/gh/lucaskabela/1/head 2025-09-07T07:38:46.4333990Z * [new branch] gh/lucaskabela/10/base -> origin/gh/lucaskabela/10/base 2025-09-07T07:38:46.4334479Z * [new branch] gh/lucaskabela/10/head -> origin/gh/lucaskabela/10/head 2025-09-07T07:38:46.4335188Z * [new branch] gh/lucaskabela/10/orig -> origin/gh/lucaskabela/10/orig 2025-09-07T07:38:46.4335842Z * [new branch] gh/lucaskabela/11/base -> origin/gh/lucaskabela/11/base 2025-09-07T07:38:46.4336275Z * [new branch] gh/lucaskabela/11/head -> origin/gh/lucaskabela/11/head 2025-09-07T07:38:46.4336872Z * [new branch] gh/lucaskabela/11/orig -> origin/gh/lucaskabela/11/orig 2025-09-07T07:38:46.4337492Z * [new branch] gh/lucaskabela/12/base -> origin/gh/lucaskabela/12/base 2025-09-07T07:38:46.4337931Z * [new branch] gh/lucaskabela/12/head -> origin/gh/lucaskabela/12/head 2025-09-07T07:38:46.4338582Z * [new branch] gh/lucaskabela/12/orig -> origin/gh/lucaskabela/12/orig 2025-09-07T07:38:46.4339217Z * [new branch] gh/lucaskabela/13/base -> origin/gh/lucaskabela/13/base 2025-09-07T07:38:46.4339641Z * [new branch] gh/lucaskabela/13/head -> origin/gh/lucaskabela/13/head 2025-09-07T07:38:46.4340226Z * [new branch] gh/lucaskabela/13/orig -> origin/gh/lucaskabela/13/orig 2025-09-07T07:38:46.4340884Z * [new branch] gh/lucaskabela/14/base -> origin/gh/lucaskabela/14/base 2025-09-07T07:38:46.4341312Z * [new branch] gh/lucaskabela/14/head -> origin/gh/lucaskabela/14/head 2025-09-07T07:38:46.4341898Z * [new branch] gh/lucaskabela/14/orig -> origin/gh/lucaskabela/14/orig 2025-09-07T07:38:46.4342558Z * [new branch] gh/lucaskabela/15/base -> origin/gh/lucaskabela/15/base 2025-09-07T07:38:46.4343201Z * [new branch] gh/lucaskabela/15/head -> origin/gh/lucaskabela/15/head 2025-09-07T07:38:46.4343630Z * [new branch] gh/lucaskabela/15/orig -> origin/gh/lucaskabela/15/orig 2025-09-07T07:38:46.4344389Z * [new branch] gh/lucaskabela/16/base -> origin/gh/lucaskabela/16/base 2025-09-07T07:38:46.4344777Z * [new branch] gh/lucaskabela/16/head -> origin/gh/lucaskabela/16/head 2025-09-07T07:38:46.4345424Z * [new branch] gh/lucaskabela/16/orig -> origin/gh/lucaskabela/16/orig 2025-09-07T07:38:46.4346074Z * [new branch] gh/lucaskabela/17/base -> origin/gh/lucaskabela/17/base 2025-09-07T07:38:46.4346514Z * [new branch] gh/lucaskabela/17/head -> origin/gh/lucaskabela/17/head 2025-09-07T07:38:46.4347089Z * [new branch] gh/lucaskabela/17/orig -> origin/gh/lucaskabela/17/orig 2025-09-07T07:38:46.4347795Z * [new branch] gh/lucaskabela/2/base -> origin/gh/lucaskabela/2/base 2025-09-07T07:38:46.4348221Z * [new branch] gh/lucaskabela/2/head -> origin/gh/lucaskabela/2/head 
2025-09-07T07:38:46.4348806Z * [new branch] gh/lucaskabela/2/orig -> origin/gh/lucaskabela/2/orig 2025-09-07T07:38:46.4349600Z * [new branch] gh/lucaskabela/3/base -> origin/gh/lucaskabela/3/base 2025-09-07T07:38:46.4349969Z * [new branch] gh/lucaskabela/3/head -> origin/gh/lucaskabela/3/head 2025-09-07T07:38:46.4350561Z * [new branch] gh/lucaskabela/3/orig -> origin/gh/lucaskabela/3/orig 2025-09-07T07:38:46.4351256Z * [new branch] gh/lucaskabela/4/base -> origin/gh/lucaskabela/4/base 2025-09-07T07:38:46.4351849Z * [new branch] gh/lucaskabela/4/head -> origin/gh/lucaskabela/4/head 2025-09-07T07:38:46.4352426Z * [new branch] gh/lucaskabela/4/orig -> origin/gh/lucaskabela/4/orig 2025-09-07T07:38:46.4353167Z * [new branch] gh/lucaskabela/5/base -> origin/gh/lucaskabela/5/base 2025-09-07T07:38:46.4353547Z * [new branch] gh/lucaskabela/5/head -> origin/gh/lucaskabela/5/head 2025-09-07T07:38:46.4354132Z * [new branch] gh/lucaskabela/5/orig -> origin/gh/lucaskabela/5/orig 2025-09-07T07:38:46.4354770Z * [new branch] gh/lucaskabela/6/base -> origin/gh/lucaskabela/6/base 2025-09-07T07:38:46.4355187Z * [new branch] gh/lucaskabela/6/head -> origin/gh/lucaskabela/6/head 2025-09-07T07:38:46.4355759Z * [new branch] gh/lucaskabela/6/orig -> origin/gh/lucaskabela/6/orig 2025-09-07T07:38:46.4356551Z * [new branch] gh/lucaskabela/7/base -> origin/gh/lucaskabela/7/base 2025-09-07T07:38:46.4356939Z * [new branch] gh/lucaskabela/7/head -> origin/gh/lucaskabela/7/head 2025-09-07T07:38:46.4357524Z * [new branch] gh/lucaskabela/7/orig -> origin/gh/lucaskabela/7/orig 2025-09-07T07:38:46.4358184Z * [new branch] gh/lucaskabela/8/base -> origin/gh/lucaskabela/8/base 2025-09-07T07:38:46.4358657Z * [new branch] gh/lucaskabela/8/head -> origin/gh/lucaskabela/8/head 2025-09-07T07:38:46.4359319Z * [new branch] gh/lucaskabela/8/orig -> origin/gh/lucaskabela/8/orig 2025-09-07T07:38:46.4360011Z * [new branch] gh/lucaskabela/9/base -> origin/gh/lucaskabela/9/base 2025-09-07T07:38:46.4360662Z * [new branch] gh/lucaskabela/9/head -> origin/gh/lucaskabela/9/head 2025-09-07T07:38:46.4361070Z * [new branch] gh/lucaskabela/9/orig -> origin/gh/lucaskabela/9/orig 2025-09-07T07:38:46.4362044Z * [new branch] gh/lw/3/base -> origin/gh/lw/3/base 2025-09-07T07:38:46.4362441Z * [new branch] gh/lw/3/head -> origin/gh/lw/3/head 2025-09-07T07:38:46.4363071Z * [new branch] gh/lw/3/orig -> origin/gh/lw/3/orig 2025-09-07T07:38:46.4363946Z * [new branch] gh/malfet/14/base -> origin/gh/malfet/14/base 2025-09-07T07:38:46.4364656Z * [new branch] gh/malfet/330/base -> origin/gh/malfet/330/base 2025-09-07T07:38:46.4365108Z * [new branch] gh/malfet/330/head -> origin/gh/malfet/330/head 2025-09-07T07:38:46.4365768Z * [new branch] gh/malfet/330/orig -> origin/gh/malfet/330/orig 2025-09-07T07:38:46.4366464Z * [new branch] gh/malfet/396/base -> origin/gh/malfet/396/base 2025-09-07T07:38:46.4366870Z * [new branch] gh/malfet/396/head -> origin/gh/malfet/396/head 2025-09-07T07:38:46.4367470Z * [new branch] gh/malfet/396/orig -> origin/gh/malfet/396/orig 2025-09-07T07:38:46.4368161Z * [new branch] gh/malfet/397/base -> origin/gh/malfet/397/base 2025-09-07T07:38:46.4369044Z * [new branch] gh/malfet/397/head -> origin/gh/malfet/397/head 2025-09-07T07:38:46.4369440Z * [new branch] gh/malfet/397/orig -> origin/gh/malfet/397/orig 2025-09-07T07:38:46.4370353Z * [new branch] gh/malfet/398/base -> origin/gh/malfet/398/base 2025-09-07T07:38:46.4370805Z * [new branch] gh/malfet/398/head -> origin/gh/malfet/398/head 2025-09-07T07:38:46.4371406Z * [new branch] 
gh/malfet/398/orig -> origin/gh/malfet/398/orig 2025-09-07T07:38:46.4372025Z * [new branch] gh/malfet/399/base -> origin/gh/malfet/399/base 2025-09-07T07:38:46.4372455Z * [new branch] gh/malfet/399/head -> origin/gh/malfet/399/head 2025-09-07T07:38:46.4373112Z * [new branch] gh/malfet/399/orig -> origin/gh/malfet/399/orig 2025-09-07T07:38:46.4373798Z * [new branch] gh/malfet/414/base -> origin/gh/malfet/414/base 2025-09-07T07:38:46.4374220Z * [new branch] gh/malfet/414/head -> origin/gh/malfet/414/head 2025-09-07T07:38:46.4374879Z * [new branch] gh/malfet/414/orig -> origin/gh/malfet/414/orig 2025-09-07T07:38:46.4375521Z * [new branch] gh/malfet/417/base -> origin/gh/malfet/417/base 2025-09-07T07:38:46.4375940Z * [new branch] gh/malfet/417/head -> origin/gh/malfet/417/head 2025-09-07T07:38:46.4376541Z * [new branch] gh/malfet/417/orig -> origin/gh/malfet/417/orig 2025-09-07T07:38:46.4377198Z * [new branch] gh/malfet/418/base -> origin/gh/malfet/418/base 2025-09-07T07:38:46.4377607Z * [new branch] gh/malfet/418/head -> origin/gh/malfet/418/head 2025-09-07T07:38:46.4378184Z * [new branch] gh/malfet/418/orig -> origin/gh/malfet/418/orig 2025-09-07T07:38:46.4379057Z * [new branch] gh/malfet/475/base -> origin/gh/malfet/475/base 2025-09-07T07:38:46.4379709Z * [new branch] gh/malfet/475/head -> origin/gh/malfet/475/head 2025-09-07T07:38:46.4380111Z * [new branch] gh/malfet/475/orig -> origin/gh/malfet/475/orig 2025-09-07T07:38:46.4380947Z * [new branch] gh/malfet/476/base -> origin/gh/malfet/476/base 2025-09-07T07:38:46.4381539Z * [new branch] gh/malfet/476/head -> origin/gh/malfet/476/head 2025-09-07T07:38:46.4381954Z * [new branch] gh/malfet/476/orig -> origin/gh/malfet/476/orig 2025-09-07T07:38:46.4382677Z * [new branch] gh/malfet/477/base -> origin/gh/malfet/477/base 2025-09-07T07:38:46.4383122Z * [new branch] gh/malfet/477/head -> origin/gh/malfet/477/head 2025-09-07T07:38:46.4383754Z * [new branch] gh/malfet/477/orig -> origin/gh/malfet/477/orig 2025-09-07T07:38:46.4384411Z * [new branch] gh/malfet/478/base -> origin/gh/malfet/478/base 2025-09-07T07:38:46.4384822Z * [new branch] gh/malfet/478/head -> origin/gh/malfet/478/head 2025-09-07T07:38:46.4385459Z * [new branch] gh/malfet/478/orig -> origin/gh/malfet/478/orig 2025-09-07T07:38:46.4386039Z * [new branch] gh/malfet/479/base -> origin/gh/malfet/479/base 2025-09-07T07:38:46.4386675Z * [new branch] gh/malfet/479/head -> origin/gh/malfet/479/head 2025-09-07T07:38:46.4387301Z * [new branch] gh/malfet/479/orig -> origin/gh/malfet/479/orig 2025-09-07T07:38:46.4388019Z * [new branch] gh/malfet/480/base -> origin/gh/malfet/480/base 2025-09-07T07:38:46.4388415Z * [new branch] gh/malfet/480/head -> origin/gh/malfet/480/head 2025-09-07T07:38:46.4389059Z * [new branch] gh/malfet/480/orig -> origin/gh/malfet/480/orig 2025-09-07T07:38:46.4389743Z * [new branch] gh/malfet/481/base -> origin/gh/malfet/481/base 2025-09-07T07:38:46.4390200Z * [new branch] gh/malfet/481/head -> origin/gh/malfet/481/head 2025-09-07T07:38:46.4390774Z * [new branch] gh/malfet/481/orig -> origin/gh/malfet/481/orig 2025-09-07T07:38:46.4392050Z * [new branch] gh/malfet/482/base -> origin/gh/malfet/482/base 2025-09-07T07:38:46.4392238Z * [new branch] gh/malfet/482/head -> origin/gh/malfet/482/head 2025-09-07T07:38:46.4392360Z * [new branch] gh/malfet/482/orig -> origin/gh/malfet/482/orig 2025-09-07T07:38:46.4394824Z * [new branch] gh/malfet/483/base -> origin/gh/malfet/483/base 2025-09-07T07:38:46.4394988Z * [new branch] gh/malfet/483/head -> origin/gh/malfet/483/head 
2025-09-07T07:38:46.4395106Z * [new branch] gh/malfet/483/orig -> origin/gh/malfet/483/orig 2025-09-07T07:38:46.4395583Z * [new branch] gh/malfet/484/base -> origin/gh/malfet/484/base 2025-09-07T07:38:46.4396058Z * [new branch] gh/malfet/484/head -> origin/gh/malfet/484/head 2025-09-07T07:38:46.4396818Z * [new branch] gh/malfet/484/orig -> origin/gh/malfet/484/orig 2025-09-07T07:38:46.4397522Z * [new branch] gh/malfet/485/base -> origin/gh/malfet/485/base 2025-09-07T07:38:46.4397995Z * [new branch] gh/malfet/485/head -> origin/gh/malfet/485/head 2025-09-07T07:38:46.4398650Z * [new branch] gh/malfet/485/orig -> origin/gh/malfet/485/orig 2025-09-07T07:38:46.4399376Z * [new branch] gh/malfet/486/base -> origin/gh/malfet/486/base 2025-09-07T07:38:46.4399818Z * [new branch] gh/malfet/486/head -> origin/gh/malfet/486/head 2025-09-07T07:38:46.4400341Z * [new branch] gh/malfet/486/orig -> origin/gh/malfet/486/orig 2025-09-07T07:38:46.4401050Z * [new branch] gh/malfet/487/base -> origin/gh/malfet/487/base 2025-09-07T07:38:46.4401430Z * [new branch] gh/malfet/487/head -> origin/gh/malfet/487/head 2025-09-07T07:38:46.4402028Z * [new branch] gh/malfet/487/orig -> origin/gh/malfet/487/orig 2025-09-07T07:38:46.4402812Z * [new branch] gh/malfet/488/base -> origin/gh/malfet/488/base 2025-09-07T07:38:46.4403208Z * [new branch] gh/malfet/488/head -> origin/gh/malfet/488/head 2025-09-07T07:38:46.4403771Z * [new branch] gh/malfet/488/orig -> origin/gh/malfet/488/orig 2025-09-07T07:38:46.4404527Z * [new branch] gh/malfet/489/base -> origin/gh/malfet/489/base 2025-09-07T07:38:46.4404928Z * [new branch] gh/malfet/489/head -> origin/gh/malfet/489/head 2025-09-07T07:38:46.4405759Z * [new branch] gh/malfet/489/orig -> origin/gh/malfet/489/orig 2025-09-07T07:38:46.4406446Z * [new branch] gh/malfet/490/base -> origin/gh/malfet/490/base 2025-09-07T07:38:46.4406851Z * [new branch] gh/malfet/490/head -> origin/gh/malfet/490/head 2025-09-07T07:38:46.4407471Z * [new branch] gh/malfet/490/orig -> origin/gh/malfet/490/orig 2025-09-07T07:38:46.4408205Z * [new branch] gh/malfet/491/base -> origin/gh/malfet/491/base 2025-09-07T07:38:46.4408838Z * [new branch] gh/malfet/491/head -> origin/gh/malfet/491/head 2025-09-07T07:38:46.4409255Z * [new branch] gh/malfet/491/orig -> origin/gh/malfet/491/orig 2025-09-07T07:38:46.4409982Z * [new branch] gh/malfet/492/base -> origin/gh/malfet/492/base 2025-09-07T07:38:46.4410431Z * [new branch] gh/malfet/492/head -> origin/gh/malfet/492/head 2025-09-07T07:38:46.4411052Z * [new branch] gh/malfet/492/orig -> origin/gh/malfet/492/orig 2025-09-07T07:38:46.4411786Z * [new branch] gh/malfet/493/base -> origin/gh/malfet/493/base 2025-09-07T07:38:46.4412177Z * [new branch] gh/malfet/493/head -> origin/gh/malfet/493/head 2025-09-07T07:38:46.4413078Z * [new branch] gh/malfet/493/orig -> origin/gh/malfet/493/orig 2025-09-07T07:38:46.4413813Z * [new branch] gh/malfet/494/base -> origin/gh/malfet/494/base 2025-09-07T07:38:46.4414186Z * [new branch] gh/malfet/494/head -> origin/gh/malfet/494/head 2025-09-07T07:38:46.4414932Z * [new branch] gh/malfet/494/orig -> origin/gh/malfet/494/orig 2025-09-07T07:38:46.4415563Z * [new branch] gh/malfet/495/base -> origin/gh/malfet/495/base 2025-09-07T07:38:46.4416198Z * [new branch] gh/malfet/495/head -> origin/gh/malfet/495/head 2025-09-07T07:38:46.4416767Z * [new branch] gh/malfet/495/orig -> origin/gh/malfet/495/orig 2025-09-07T07:38:46.4417443Z * [new branch] gh/malfet/496/base -> origin/gh/malfet/496/base 2025-09-07T07:38:46.4417853Z * [new branch] 
gh/malfet/496/head -> origin/gh/malfet/496/head 2025-09-07T07:38:46.4418447Z * [new branch] gh/malfet/496/orig -> origin/gh/malfet/496/orig 2025-09-07T07:38:46.4419126Z * [new branch] gh/malfet/497/base -> origin/gh/malfet/497/base 2025-09-07T07:38:46.4419531Z * [new branch] gh/malfet/497/head -> origin/gh/malfet/497/head 2025-09-07T07:38:46.4420216Z * [new branch] gh/malfet/497/orig -> origin/gh/malfet/497/orig 2025-09-07T07:38:46.4421225Z * [new branch] gh/malfet/498/base -> origin/gh/malfet/498/base 2025-09-07T07:38:46.4421650Z * [new branch] gh/malfet/498/head -> origin/gh/malfet/498/head 2025-09-07T07:38:46.4422214Z * [new branch] gh/malfet/498/orig -> origin/gh/malfet/498/orig 2025-09-07T07:38:46.4422863Z * [new branch] gh/malfet/499/base -> origin/gh/malfet/499/base 2025-09-07T07:38:46.4423313Z * [new branch] gh/malfet/499/head -> origin/gh/malfet/499/head 2025-09-07T07:38:46.4424058Z * [new branch] gh/malfet/499/orig -> origin/gh/malfet/499/orig 2025-09-07T07:38:46.4424779Z * [new branch] gh/malfet/500/base -> origin/gh/malfet/500/base 2025-09-07T07:38:46.4425188Z * [new branch] gh/malfet/500/head -> origin/gh/malfet/500/head 2025-09-07T07:38:46.4425857Z * [new branch] gh/malfet/500/orig -> origin/gh/malfet/500/orig 2025-09-07T07:38:46.4426529Z * [new branch] gh/malfet/501/base -> origin/gh/malfet/501/base 2025-09-07T07:38:46.4426948Z * [new branch] gh/malfet/501/head -> origin/gh/malfet/501/head 2025-09-07T07:38:46.4427544Z * [new branch] gh/malfet/501/orig -> origin/gh/malfet/501/orig 2025-09-07T07:38:46.4428275Z * [new branch] gh/malfet/502/base -> origin/gh/malfet/502/base 2025-09-07T07:38:46.4428673Z * [new branch] gh/malfet/502/head -> origin/gh/malfet/502/head 2025-09-07T07:38:46.4429249Z * [new branch] gh/malfet/502/orig -> origin/gh/malfet/502/orig 2025-09-07T07:38:46.4429969Z * [new branch] gh/malfet/503/base -> origin/gh/malfet/503/base 2025-09-07T07:38:46.4430383Z * [new branch] gh/malfet/503/head -> origin/gh/malfet/503/head 2025-09-07T07:38:46.4431037Z * [new branch] gh/malfet/503/orig -> origin/gh/malfet/503/orig 2025-09-07T07:38:46.4431765Z * [new branch] gh/malfet/504/base -> origin/gh/malfet/504/base 2025-09-07T07:38:46.4432181Z * [new branch] gh/malfet/504/head -> origin/gh/malfet/504/head 2025-09-07T07:38:46.4432885Z * [new branch] gh/malfet/504/orig -> origin/gh/malfet/504/orig 2025-09-07T07:38:46.4433672Z * [new branch] gh/malfet/505/base -> origin/gh/malfet/505/base 2025-09-07T07:38:46.4434074Z * [new branch] gh/malfet/505/head -> origin/gh/malfet/505/head 2025-09-07T07:38:46.4434684Z * [new branch] gh/malfet/505/orig -> origin/gh/malfet/505/orig 2025-09-07T07:38:46.4435424Z * [new branch] gh/malfet/506/base -> origin/gh/malfet/506/base 2025-09-07T07:38:46.4435787Z * [new branch] gh/malfet/506/head -> origin/gh/malfet/506/head 2025-09-07T07:38:46.4436353Z * [new branch] gh/malfet/506/orig -> origin/gh/malfet/506/orig 2025-09-07T07:38:46.4437067Z * [new branch] gh/malfet/507/base -> origin/gh/malfet/507/base 2025-09-07T07:38:46.4437469Z * [new branch] gh/malfet/507/head -> origin/gh/malfet/507/head 2025-09-07T07:38:46.4438101Z * [new branch] gh/malfet/507/orig -> origin/gh/malfet/507/orig 2025-09-07T07:38:46.4438933Z * [new branch] gh/malfet/508/base -> origin/gh/malfet/508/base 2025-09-07T07:38:46.4439343Z * [new branch] gh/malfet/508/head -> origin/gh/malfet/508/head 2025-09-07T07:38:46.4439957Z * [new branch] gh/malfet/508/orig -> origin/gh/malfet/508/orig 2025-09-07T07:38:46.4440615Z * [new branch] gh/malfet/509/base -> origin/gh/malfet/509/base 
2025-09-07T07:38:46.4441231Z * [new branch] gh/malfet/509/head -> origin/gh/malfet/509/head 2025-09-07T07:38:46.4441789Z * [new branch] gh/malfet/509/orig -> origin/gh/malfet/509/orig 2025-09-07T07:38:46.4442547Z * [new branch] gh/malfet/510/base -> origin/gh/malfet/510/base 2025-09-07T07:38:46.4442939Z * [new branch] gh/malfet/510/head -> origin/gh/malfet/510/head 2025-09-07T07:38:46.4443541Z * [new branch] gh/malfet/510/orig -> origin/gh/malfet/510/orig 2025-09-07T07:38:46.4444243Z * [new branch] gh/malfet/511/base -> origin/gh/malfet/511/base 2025-09-07T07:38:46.4444654Z * [new branch] gh/malfet/511/head -> origin/gh/malfet/511/head 2025-09-07T07:38:46.4445322Z * [new branch] gh/malfet/511/orig -> origin/gh/malfet/511/orig 2025-09-07T07:38:46.4446027Z * [new branch] gh/malfet/512/base -> origin/gh/malfet/512/base 2025-09-07T07:38:46.4446462Z * [new branch] gh/malfet/512/head -> origin/gh/malfet/512/head 2025-09-07T07:38:46.4447024Z * [new branch] gh/malfet/512/orig -> origin/gh/malfet/512/orig 2025-09-07T07:38:46.4447723Z * [new branch] gh/malfet/513/base -> origin/gh/malfet/513/base 2025-09-07T07:38:46.4448141Z * [new branch] gh/malfet/513/head -> origin/gh/malfet/513/head 2025-09-07T07:38:46.4448752Z * [new branch] gh/malfet/513/orig -> origin/gh/malfet/513/orig 2025-09-07T07:38:46.4449468Z * [new branch] gh/malfet/64/base -> origin/gh/malfet/64/base 2025-09-07T07:38:46.4450073Z * [new branch] gh/malfet/64/head -> origin/gh/malfet/64/head 2025-09-07T07:38:46.4451052Z * [new branch] gh/manuelcandales/10/base -> origin/gh/manuelcandales/10/base 2025-09-07T07:38:46.4451481Z * [new branch] gh/manuelcandales/10/head -> origin/gh/manuelcandales/10/head 2025-09-07T07:38:46.4452119Z * [new branch] gh/manuelcandales/10/orig -> origin/gh/manuelcandales/10/orig 2025-09-07T07:38:46.4452835Z * [new branch] gh/manuelcandales/11/base -> origin/gh/manuelcandales/11/base 2025-09-07T07:38:46.4453245Z * [new branch] gh/manuelcandales/11/head -> origin/gh/manuelcandales/11/head 2025-09-07T07:38:46.4454138Z * [new branch] gh/manuelcandales/11/orig -> origin/gh/manuelcandales/11/orig 2025-09-07T07:38:46.4454826Z * [new branch] gh/manuelcandales/9/base -> origin/gh/manuelcandales/9/base 2025-09-07T07:38:46.4455228Z * [new branch] gh/manuelcandales/9/head -> origin/gh/manuelcandales/9/head 2025-09-07T07:38:46.4455813Z * [new branch] gh/manuelcandales/9/orig -> origin/gh/manuelcandales/9/orig 2025-09-07T07:38:46.4456868Z * [new branch] gh/markkm/1/base -> origin/gh/markkm/1/base 2025-09-07T07:38:46.4457812Z * [new branch] gh/masnesral/204/base -> origin/gh/masnesral/204/base 2025-09-07T07:38:46.4458387Z * [new branch] gh/masnesral/204/head -> origin/gh/masnesral/204/head 2025-09-07T07:38:46.4459001Z * [new branch] gh/masnesral/204/orig -> origin/gh/masnesral/204/orig 2025-09-07T07:38:46.4459792Z * [new branch] gh/masnesral/235/base -> origin/gh/masnesral/235/base 2025-09-07T07:38:46.4460454Z * [new branch] gh/masnesral/235/head -> origin/gh/masnesral/235/head 2025-09-07T07:38:46.4460929Z * [new branch] gh/masnesral/235/orig -> origin/gh/masnesral/235/orig 2025-09-07T07:38:46.4461735Z * [new branch] gh/masnesral/34/base -> origin/gh/masnesral/34/base 2025-09-07T07:38:46.4462642Z * [new branch] gh/mhorowitz/0/base -> origin/gh/mhorowitz/0/base 2025-09-07T07:38:46.4463212Z * [new branch] gh/mhorowitz/0/head -> origin/gh/mhorowitz/0/head 2025-09-07T07:38:46.4463825Z * [new branch] gh/mhorowitz/1/base -> origin/gh/mhorowitz/1/base 2025-09-07T07:38:46.4464278Z * [new branch] gh/mhorowitz/1/head -> 
origin/gh/mhorowitz/1/head 2025-09-07T07:38:46.4465013Z * [new branch] gh/mhorowitz/2/base -> origin/gh/mhorowitz/2/base 2025-09-07T07:38:46.4465591Z * [new branch] gh/mhorowitz/2/head -> origin/gh/mhorowitz/2/head 2025-09-07T07:38:46.4466230Z * [new branch] gh/mhorowitz/3/base -> origin/gh/mhorowitz/3/base 2025-09-07T07:38:46.4466646Z * [new branch] gh/mhorowitz/3/head -> origin/gh/mhorowitz/3/head 2025-09-07T07:38:46.4467338Z * [new branch] gh/mhorowitz/4/base -> origin/gh/mhorowitz/4/base 2025-09-07T07:38:46.4467724Z * [new branch] gh/mhorowitz/4/head -> origin/gh/mhorowitz/4/head 2025-09-07T07:38:46.4468441Z * [new branch] gh/mhorowitz/5/base -> origin/gh/mhorowitz/5/base 2025-09-07T07:38:46.4468845Z * [new branch] gh/mhorowitz/5/head -> origin/gh/mhorowitz/5/head 2025-09-07T07:38:46.4469677Z * [new branch] gh/mhorowitz/6/base -> origin/gh/mhorowitz/6/base 2025-09-07T07:38:46.4470068Z * [new branch] gh/mhorowitz/6/head -> origin/gh/mhorowitz/6/head 2025-09-07T07:38:46.4471086Z * [new branch] gh/mikaylagawarecki/234/base -> origin/gh/mikaylagawarecki/234/base 2025-09-07T07:38:46.4471687Z * [new branch] gh/mikaylagawarecki/234/head -> origin/gh/mikaylagawarecki/234/head 2025-09-07T07:38:46.4472322Z * [new branch] gh/mikaylagawarecki/235/base -> origin/gh/mikaylagawarecki/235/base 2025-09-07T07:38:46.4472698Z * [new branch] gh/mikaylagawarecki/235/head -> origin/gh/mikaylagawarecki/235/head 2025-09-07T07:38:46.4473530Z * [new branch] gh/mikaylagawarecki/236/base -> origin/gh/mikaylagawarecki/236/base 2025-09-07T07:38:46.4473860Z * [new branch] gh/mikaylagawarecki/236/head -> origin/gh/mikaylagawarecki/236/head 2025-09-07T07:38:46.4474638Z * [new branch] gh/mikaylagawarecki/237/base -> origin/gh/mikaylagawarecki/237/base 2025-09-07T07:38:46.4475008Z * [new branch] gh/mikaylagawarecki/237/head -> origin/gh/mikaylagawarecki/237/head 2025-09-07T07:38:46.4475811Z * [new branch] gh/mikaylagawarecki/238/base -> origin/gh/mikaylagawarecki/238/base 2025-09-07T07:38:46.4476240Z * [new branch] gh/mikaylagawarecki/238/head -> origin/gh/mikaylagawarecki/238/head 2025-09-07T07:38:46.4477019Z * [new branch] gh/mikaylagawarecki/317/base -> origin/gh/mikaylagawarecki/317/base 2025-09-07T07:38:46.4477491Z * [new branch] gh/mikaylagawarecki/317/head -> origin/gh/mikaylagawarecki/317/head 2025-09-07T07:38:46.4478501Z * [new branch] gh/mikaylagawarecki/317/orig -> origin/gh/mikaylagawarecki/317/orig 2025-09-07T07:38:46.4479379Z * [new branch] gh/mikaylagawarecki/320/base -> origin/gh/mikaylagawarecki/320/base 2025-09-07T07:38:46.4479783Z * [new branch] gh/mikaylagawarecki/320/head -> origin/gh/mikaylagawarecki/320/head 2025-09-07T07:38:46.4480374Z * [new branch] gh/mikaylagawarecki/320/orig -> origin/gh/mikaylagawarecki/320/orig 2025-09-07T07:38:46.4481114Z * [new branch] gh/mikaylagawarecki/329/base -> origin/gh/mikaylagawarecki/329/base 2025-09-07T07:38:46.4481597Z * [new branch] gh/mikaylagawarecki/329/head -> origin/gh/mikaylagawarecki/329/head 2025-09-07T07:38:46.4482080Z * [new branch] gh/mikaylagawarecki/329/orig -> origin/gh/mikaylagawarecki/329/orig 2025-09-07T07:38:46.4482905Z * [new branch] gh/mikaylagawarecki/330/base -> origin/gh/mikaylagawarecki/330/base 2025-09-07T07:38:46.4483322Z * [new branch] gh/mikaylagawarecki/330/head -> origin/gh/mikaylagawarecki/330/head 2025-09-07T07:38:46.4483922Z * [new branch] gh/mikaylagawarecki/330/orig -> origin/gh/mikaylagawarecki/330/orig 2025-09-07T07:38:46.4484673Z * [new branch] gh/mikaylagawarecki/331/base -> origin/gh/mikaylagawarecki/331/base 
2025-09-07T07:38:46.4485107Z * [new branch] gh/mikaylagawarecki/331/head -> origin/gh/mikaylagawarecki/331/head 2025-09-07T07:38:46.4485761Z * [new branch] gh/mikaylagawarecki/331/orig -> origin/gh/mikaylagawarecki/331/orig 2025-09-07T07:38:46.4486673Z * [new branch] gh/mikaylagawarecki/332/base -> origin/gh/mikaylagawarecki/332/base 2025-09-07T07:38:46.4487316Z * [new branch] gh/mikaylagawarecki/332/head -> origin/gh/mikaylagawarecki/332/head 2025-09-07T07:38:46.4487646Z * [new branch] gh/mikaylagawarecki/332/orig -> origin/gh/mikaylagawarecki/332/orig 2025-09-07T07:38:46.4488487Z * [new branch] gh/mikaylagawarecki/334/base -> origin/gh/mikaylagawarecki/334/base 2025-09-07T07:38:46.4488881Z * [new branch] gh/mikaylagawarecki/334/head -> origin/gh/mikaylagawarecki/334/head 2025-09-07T07:38:46.4489482Z * [new branch] gh/mikaylagawarecki/334/orig -> origin/gh/mikaylagawarecki/334/orig 2025-09-07T07:38:46.4490182Z * [new branch] gh/mikaylagawarecki/335/base -> origin/gh/mikaylagawarecki/335/base 2025-09-07T07:38:46.4490653Z * [new branch] gh/mikaylagawarecki/335/head -> origin/gh/mikaylagawarecki/335/head 2025-09-07T07:38:46.4491260Z * [new branch] gh/mikaylagawarecki/335/orig -> origin/gh/mikaylagawarecki/335/orig 2025-09-07T07:38:46.4491991Z * [new branch] gh/mikaylagawarecki/336/base -> origin/gh/mikaylagawarecki/336/base 2025-09-07T07:38:46.4492445Z * [new branch] gh/mikaylagawarecki/336/head -> origin/gh/mikaylagawarecki/336/head 2025-09-07T07:38:46.4493104Z * [new branch] gh/mikaylagawarecki/336/orig -> origin/gh/mikaylagawarecki/336/orig 2025-09-07T07:38:46.4493706Z * [new branch] gh/mikaylagawarecki/337/base -> origin/gh/mikaylagawarecki/337/base 2025-09-07T07:38:46.4494121Z * [new branch] gh/mikaylagawarecki/337/head -> origin/gh/mikaylagawarecki/337/head 2025-09-07T07:38:46.4494732Z * [new branch] gh/mikaylagawarecki/337/orig -> origin/gh/mikaylagawarecki/337/orig 2025-09-07T07:38:46.4495414Z * [new branch] gh/mikaylagawarecki/338/base -> origin/gh/mikaylagawarecki/338/base 2025-09-07T07:38:46.4495873Z * [new branch] gh/mikaylagawarecki/338/head -> origin/gh/mikaylagawarecki/338/head 2025-09-07T07:38:46.4496569Z * [new branch] gh/mikaylagawarecki/338/orig -> origin/gh/mikaylagawarecki/338/orig 2025-09-07T07:38:46.4497517Z * [new branch] gh/mikaylagawarecki/339/base -> origin/gh/mikaylagawarecki/339/base 2025-09-07T07:38:46.4497919Z * [new branch] gh/mikaylagawarecki/339/head -> origin/gh/mikaylagawarecki/339/head 2025-09-07T07:38:46.4498558Z * [new branch] gh/mikaylagawarecki/339/orig -> origin/gh/mikaylagawarecki/339/orig 2025-09-07T07:38:46.4499455Z * [new branch] gh/mlazos/1/base -> origin/gh/mlazos/1/base 2025-09-07T07:38:46.4500443Z * [new branch] gh/mlazos/1/head -> origin/gh/mlazos/1/head 2025-09-07T07:38:46.4500878Z * [new branch] gh/mlazos/1/orig -> origin/gh/mlazos/1/orig 2025-09-07T07:38:46.4501714Z * [new branch] gh/mlazos/12/base -> origin/gh/mlazos/12/base 2025-09-07T07:38:46.4502122Z * [new branch] gh/mlazos/12/head -> origin/gh/mlazos/12/head 2025-09-07T07:38:46.4502729Z * [new branch] gh/mlazos/12/orig -> origin/gh/mlazos/12/orig 2025-09-07T07:38:46.4503821Z * [new branch] gh/mlazos/13/base -> origin/gh/mlazos/13/base 2025-09-07T07:38:46.4504240Z * [new branch] gh/mlazos/13/head -> origin/gh/mlazos/13/head 2025-09-07T07:38:46.4504868Z * [new branch] gh/mlazos/13/orig -> origin/gh/mlazos/13/orig 2025-09-07T07:38:46.4505602Z * [new branch] gh/mlazos/14/base -> origin/gh/mlazos/14/base 2025-09-07T07:38:46.4506396Z * [new branch] gh/mlazos/14/head -> 
origin/gh/mlazos/14/head 2025-09-07T07:38:46.4507023Z * [new branch] gh/mlazos/14/orig -> origin/gh/mlazos/14/orig 2025-09-07T07:38:46.4507817Z * [new branch] gh/mlazos/15/base -> origin/gh/mlazos/15/base 2025-09-07T07:38:46.4508379Z * [new branch] gh/mlazos/15/head -> origin/gh/mlazos/15/head 2025-09-07T07:38:46.4508789Z * [new branch] gh/mlazos/15/orig -> origin/gh/mlazos/15/orig 2025-09-07T07:38:46.4509619Z * [new branch] gh/mlazos/16/base -> origin/gh/mlazos/16/base 2025-09-07T07:38:46.4510127Z * [new branch] gh/mlazos/16/head -> origin/gh/mlazos/16/head 2025-09-07T07:38:46.4510699Z * [new branch] gh/mlazos/16/orig -> origin/gh/mlazos/16/orig 2025-09-07T07:38:46.4511366Z * [new branch] gh/mlazos/17/base -> origin/gh/mlazos/17/base 2025-09-07T07:38:46.4511791Z * [new branch] gh/mlazos/17/head -> origin/gh/mlazos/17/head 2025-09-07T07:38:46.4512250Z * [new branch] gh/mlazos/17/orig -> origin/gh/mlazos/17/orig 2025-09-07T07:38:46.4513077Z * [new branch] gh/mlazos/2/base -> origin/gh/mlazos/2/base 2025-09-07T07:38:46.4513454Z * [new branch] gh/mlazos/2/head -> origin/gh/mlazos/2/head 2025-09-07T07:38:46.4513921Z * [new branch] gh/mlazos/2/orig -> origin/gh/mlazos/2/orig 2025-09-07T07:38:46.4514798Z * [new branch] gh/mlazos/3/base -> origin/gh/mlazos/3/base 2025-09-07T07:38:46.4515255Z * [new branch] gh/mlazos/3/head -> origin/gh/mlazos/3/head 2025-09-07T07:38:46.4515957Z * [new branch] gh/mlazos/3/orig -> origin/gh/mlazos/3/orig 2025-09-07T07:38:46.4516899Z * [new branch] gh/mrmiywj/1/base -> origin/gh/mrmiywj/1/base 2025-09-07T07:38:46.4517503Z * [new branch] gh/mrmiywj/1/head -> origin/gh/mrmiywj/1/head 2025-09-07T07:38:46.4518379Z * [new branch] gh/muchulee8/62/base -> origin/gh/muchulee8/62/base 2025-09-07T07:38:46.4518968Z * [new branch] gh/muchulee8/62/head -> origin/gh/muchulee8/62/head 2025-09-07T07:38:46.4519421Z * [new branch] gh/muchulee8/62/orig -> origin/gh/muchulee8/62/orig 2025-09-07T07:38:46.4520192Z * [new branch] gh/muchulee8/63/base -> origin/gh/muchulee8/63/base 2025-09-07T07:38:46.4520634Z * [new branch] gh/muchulee8/63/head -> origin/gh/muchulee8/63/head 2025-09-07T07:38:46.4521306Z * [new branch] gh/muchulee8/63/orig -> origin/gh/muchulee8/63/orig 2025-09-07T07:38:46.4522231Z * [new branch] gh/muchulee8/64/base -> origin/gh/muchulee8/64/base 2025-09-07T07:38:46.4523076Z * [new branch] gh/muchulee8/64/head -> origin/gh/muchulee8/64/head 2025-09-07T07:38:46.4523523Z * [new branch] gh/muchulee8/64/orig -> origin/gh/muchulee8/64/orig 2025-09-07T07:38:46.4524431Z * [new branch] gh/muchulee8/65/base -> origin/gh/muchulee8/65/base 2025-09-07T07:38:46.4525011Z * [new branch] gh/muchulee8/65/head -> origin/gh/muchulee8/65/head 2025-09-07T07:38:46.4525696Z * [new branch] gh/muchulee8/65/orig -> origin/gh/muchulee8/65/orig 2025-09-07T07:38:46.4526663Z * [new branch] gh/naveenthangudu/1/base -> origin/gh/naveenthangudu/1/base 2025-09-07T07:38:46.4527420Z * [new branch] gh/naveenthangudu/1/head -> origin/gh/naveenthangudu/1/head 2025-09-07T07:38:46.4528026Z * [new branch] gh/naveenthangudu/1/orig -> origin/gh/naveenthangudu/1/orig 2025-09-07T07:38:46.4528977Z * [new branch] gh/naveenthangudu/2/base -> origin/gh/naveenthangudu/2/base 2025-09-07T07:38:46.4529428Z * [new branch] gh/naveenthangudu/2/head -> origin/gh/naveenthangudu/2/head 2025-09-07T07:38:46.4530015Z * [new branch] gh/naveenthangudu/2/orig -> origin/gh/naveenthangudu/2/orig 2025-09-07T07:38:46.4530771Z * [new branch] gh/naveenthangudu/3/base -> origin/gh/naveenthangudu/3/base 2025-09-07T07:38:46.4531220Z * [new 
branch] gh/naveenthangudu/3/head -> origin/gh/naveenthangudu/3/head 2025-09-07T07:38:46.4531820Z * [new branch] gh/naveenthangudu/3/orig -> origin/gh/naveenthangudu/3/orig 2025-09-07T07:38:46.4532502Z * [new branch] gh/naveenthangudu/4/base -> origin/gh/naveenthangudu/4/base 2025-09-07T07:38:46.4532901Z * [new branch] gh/naveenthangudu/4/head -> origin/gh/naveenthangudu/4/head 2025-09-07T07:38:46.4533605Z * [new branch] gh/naveenthangudu/4/orig -> origin/gh/naveenthangudu/4/orig 2025-09-07T07:38:46.4534293Z * [new branch] gh/naveenthangudu/5/base -> origin/gh/naveenthangudu/5/base 2025-09-07T07:38:46.4534729Z * [new branch] gh/naveenthangudu/5/head -> origin/gh/naveenthangudu/5/head 2025-09-07T07:38:46.4535532Z * [new branch] gh/naveenthangudu/5/orig -> origin/gh/naveenthangudu/5/orig 2025-09-07T07:38:46.4536243Z * [new branch] gh/naveenthangudu/6/base -> origin/gh/naveenthangudu/6/base 2025-09-07T07:38:46.4536654Z * [new branch] gh/naveenthangudu/6/head -> origin/gh/naveenthangudu/6/head 2025-09-07T07:38:46.4537108Z * [new branch] gh/naveenthangudu/6/orig -> origin/gh/naveenthangudu/6/orig 2025-09-07T07:38:46.4538109Z * [new branch] gh/oulgen/35/base -> origin/gh/oulgen/35/base 2025-09-07T07:38:46.4538545Z * [new branch] gh/oulgen/35/head -> origin/gh/oulgen/35/head 2025-09-07T07:38:46.4539117Z * [new branch] gh/oulgen/35/orig -> origin/gh/oulgen/35/orig 2025-09-07T07:38:46.4539830Z * [new branch] gh/oulgen/48/base -> origin/gh/oulgen/48/base 2025-09-07T07:38:46.4540251Z * [new branch] gh/oulgen/48/head -> origin/gh/oulgen/48/head 2025-09-07T07:38:46.4540867Z * [new branch] gh/oulgen/48/orig -> origin/gh/oulgen/48/orig 2025-09-07T07:38:46.4541513Z * [new branch] gh/oulgen/49/base -> origin/gh/oulgen/49/base 2025-09-07T07:38:46.4541963Z * [new branch] gh/oulgen/49/head -> origin/gh/oulgen/49/head 2025-09-07T07:38:46.4542582Z * [new branch] gh/oulgen/49/orig -> origin/gh/oulgen/49/orig 2025-09-07T07:38:46.4543620Z * [new branch] gh/pearu/108/base -> origin/gh/pearu/108/base 2025-09-07T07:38:46.4544339Z * [new branch] gh/pearu/108/head -> origin/gh/pearu/108/head 2025-09-07T07:38:46.4544805Z * [new branch] gh/pearu/108/orig -> origin/gh/pearu/108/orig 2025-09-07T07:38:46.4545927Z * [new branch] gh/pearu/109/base -> origin/gh/pearu/109/base 2025-09-07T07:38:46.4546368Z * [new branch] gh/pearu/109/head -> origin/gh/pearu/109/head 2025-09-07T07:38:46.4547002Z * [new branch] gh/pearu/109/orig -> origin/gh/pearu/109/orig 2025-09-07T07:38:46.4547682Z * [new branch] gh/pearu/110/base -> origin/gh/pearu/110/base 2025-09-07T07:38:46.4548298Z * [new branch] gh/pearu/110/head -> origin/gh/pearu/110/head 2025-09-07T07:38:46.4548628Z * [new branch] gh/pearu/110/orig -> origin/gh/pearu/110/orig 2025-09-07T07:38:46.4549454Z * [new branch] gh/pearu/111/base -> origin/gh/pearu/111/base 2025-09-07T07:38:46.4549833Z * [new branch] gh/pearu/111/head -> origin/gh/pearu/111/head 2025-09-07T07:38:46.4550495Z * [new branch] gh/pearu/111/orig -> origin/gh/pearu/111/orig 2025-09-07T07:38:46.4551389Z * [new branch] gh/pearu/112/base -> origin/gh/pearu/112/base 2025-09-07T07:38:46.4551919Z * [new branch] gh/pearu/112/head -> origin/gh/pearu/112/head 2025-09-07T07:38:46.4552373Z * [new branch] gh/pearu/112/orig -> origin/gh/pearu/112/orig 2025-09-07T07:38:46.4553525Z * [new branch] gh/pearu/113/base -> origin/gh/pearu/113/base 2025-09-07T07:38:46.4553959Z * [new branch] gh/pearu/113/head -> origin/gh/pearu/113/head 2025-09-07T07:38:46.4554654Z * [new branch] gh/pearu/113/orig -> origin/gh/pearu/113/orig 
2025-09-07T07:38:46.4555387Z * [new branch] gh/pearu/114/base -> origin/gh/pearu/114/base 2025-09-07T07:38:46.4555793Z * [new branch] gh/pearu/114/head -> origin/gh/pearu/114/head 2025-09-07T07:38:46.4556466Z * [new branch] gh/pearu/114/orig -> origin/gh/pearu/114/orig 2025-09-07T07:38:46.4557182Z * [new branch] gh/pearu/115/base -> origin/gh/pearu/115/base 2025-09-07T07:38:46.4557638Z * [new branch] gh/pearu/115/head -> origin/gh/pearu/115/head 2025-09-07T07:38:46.4558083Z * [new branch] gh/pearu/115/orig -> origin/gh/pearu/115/orig 2025-09-07T07:38:46.4558983Z * [new branch] gh/pearu/116/base -> origin/gh/pearu/116/base 2025-09-07T07:38:46.4559412Z * [new branch] gh/pearu/116/head -> origin/gh/pearu/116/head 2025-09-07T07:38:46.4560034Z * [new branch] gh/pearu/116/orig -> origin/gh/pearu/116/orig 2025-09-07T07:38:46.4560707Z * [new branch] gh/pearu/117/base -> origin/gh/pearu/117/base 2025-09-07T07:38:46.4561101Z * [new branch] gh/pearu/117/head -> origin/gh/pearu/117/head 2025-09-07T07:38:46.4561545Z * [new branch] gh/pearu/117/orig -> origin/gh/pearu/117/orig 2025-09-07T07:38:46.4562639Z * [new branch] gh/pearu/56/base -> origin/gh/pearu/56/base 2025-09-07T07:38:46.4563277Z * [new branch] gh/pearu/56/head -> origin/gh/pearu/56/head 2025-09-07T07:38:46.4563896Z * [new branch] gh/pearu/56/orig -> origin/gh/pearu/56/orig 2025-09-07T07:38:46.4564726Z * [new branch] gh/pearu/97/base -> origin/gh/pearu/97/base 2025-09-07T07:38:46.4565166Z * [new branch] gh/pearu/97/head -> origin/gh/pearu/97/head 2025-09-07T07:38:46.4565753Z * [new branch] gh/pearu/97/orig -> origin/gh/pearu/97/orig 2025-09-07T07:38:46.4566739Z * [new branch] gh/qqaatw/29/base -> origin/gh/qqaatw/29/base 2025-09-07T07:38:46.4567218Z * [new branch] gh/qqaatw/29/head -> origin/gh/qqaatw/29/head 2025-09-07T07:38:46.4567671Z * [new branch] gh/qqaatw/29/orig -> origin/gh/qqaatw/29/orig 2025-09-07T07:38:46.4568471Z * [new branch] gh/raymo/refresh-script -> origin/gh/raymo/refresh-script 2025-09-07T07:38:46.4569309Z * [new branch] gh/rec/141/base -> origin/gh/rec/141/base 2025-09-07T07:38:46.4569769Z * [new branch] gh/rec/141/head -> origin/gh/rec/141/head 2025-09-07T07:38:46.4570527Z * [new branch] gh/rec/153/base -> origin/gh/rec/153/base 2025-09-07T07:38:46.4570929Z * [new branch] gh/rec/153/head -> origin/gh/rec/153/head 2025-09-07T07:38:46.4571562Z * [new branch] gh/rec/153/orig -> origin/gh/rec/153/orig 2025-09-07T07:38:46.4572260Z * [new branch] gh/rec/154/base -> origin/gh/rec/154/base 2025-09-07T07:38:46.4572694Z * [new branch] gh/rec/154/head -> origin/gh/rec/154/head 2025-09-07T07:38:46.4573452Z * [new branch] gh/rec/154/orig -> origin/gh/rec/154/orig 2025-09-07T07:38:46.4574134Z * [new branch] gh/rec/156/base -> origin/gh/rec/156/base 2025-09-07T07:38:46.4574538Z * [new branch] gh/rec/156/head -> origin/gh/rec/156/head 2025-09-07T07:38:46.4575095Z * [new branch] gh/rec/156/orig -> origin/gh/rec/156/orig 2025-09-07T07:38:46.4575859Z * [new branch] gh/rec/160/base -> origin/gh/rec/160/base 2025-09-07T07:38:46.4576277Z * [new branch] gh/rec/160/head -> origin/gh/rec/160/head 2025-09-07T07:38:46.4576874Z * [new branch] gh/rec/160/orig -> origin/gh/rec/160/orig 2025-09-07T07:38:46.4577611Z * [new branch] gh/rec/162/base -> origin/gh/rec/162/base 2025-09-07T07:38:46.4578015Z * [new branch] gh/rec/162/head -> origin/gh/rec/162/head 2025-09-07T07:38:46.4578605Z * [new branch] gh/rec/162/orig -> origin/gh/rec/162/orig 2025-09-07T07:38:46.4579265Z * [new branch] gh/rec/163/base -> origin/gh/rec/163/base 
2025-09-07T07:38:46.4579676Z * [new branch] gh/rec/163/head -> origin/gh/rec/163/head 2025-09-07T07:38:46.4580335Z * [new branch] gh/rec/163/orig -> origin/gh/rec/163/orig 2025-09-07T07:38:46.4581033Z * [new branch] gh/rec/164/base -> origin/gh/rec/164/base 2025-09-07T07:38:46.4583780Z * [new branch] gh/rec/164/head -> origin/gh/rec/164/head 2025-09-07T07:38:46.4584360Z * [new branch] gh/rec/164/orig -> origin/gh/rec/164/orig 2025-09-07T07:38:46.4585095Z * [new branch] gh/rec/165/base -> origin/gh/rec/165/base 2025-09-07T07:38:46.4585531Z * [new branch] gh/rec/165/head -> origin/gh/rec/165/head 2025-09-07T07:38:46.4586114Z * [new branch] gh/rec/165/orig -> origin/gh/rec/165/orig 2025-09-07T07:38:46.4586839Z * [new branch] gh/rec/166/base -> origin/gh/rec/166/base 2025-09-07T07:38:46.4587237Z * [new branch] gh/rec/166/head -> origin/gh/rec/166/head 2025-09-07T07:38:46.4587851Z * [new branch] gh/rec/166/orig -> origin/gh/rec/166/orig 2025-09-07T07:38:46.4588817Z * [new branch] gh/robert-hardwick/1/base -> origin/gh/robert-hardwick/1/base 2025-09-07T07:38:46.4589433Z * [new branch] gh/robert-hardwick/1/head -> origin/gh/robert-hardwick/1/head 2025-09-07T07:38:46.4589857Z * [new branch] gh/robert-hardwick/1/orig -> origin/gh/robert-hardwick/1/orig 2025-09-07T07:38:46.4590691Z * [new branch] gh/robert-hardwick/2/base -> origin/gh/robert-hardwick/2/base 2025-09-07T07:38:46.4591182Z * [new branch] gh/robert-hardwick/2/head -> origin/gh/robert-hardwick/2/head 2025-09-07T07:38:46.4591814Z * [new branch] gh/robert-hardwick/2/orig -> origin/gh/robert-hardwick/2/orig 2025-09-07T07:38:46.4592533Z * [new branch] gh/robert-hardwick/3/base -> origin/gh/robert-hardwick/3/base 2025-09-07T07:38:46.4593164Z * [new branch] gh/robert-hardwick/3/head -> origin/gh/robert-hardwick/3/head 2025-09-07T07:38:46.4593593Z * [new branch] gh/robert-hardwick/3/orig -> origin/gh/robert-hardwick/3/orig 2025-09-07T07:38:46.4594701Z * [new branch] gh/robert-hardwick/4/base -> origin/gh/robert-hardwick/4/base 2025-09-07T07:38:46.4595113Z * [new branch] gh/robert-hardwick/4/head -> origin/gh/robert-hardwick/4/head 2025-09-07T07:38:46.4595734Z * [new branch] gh/robert-hardwick/4/orig -> origin/gh/robert-hardwick/4/orig 2025-09-07T07:38:46.4596672Z * [new branch] gh/rtimpe/1/base -> origin/gh/rtimpe/1/base 2025-09-07T07:38:46.4597101Z * [new branch] gh/rtimpe/1/head -> origin/gh/rtimpe/1/head 2025-09-07T07:38:46.4597936Z * [new branch] gh/rtimpe/10/base -> origin/gh/rtimpe/10/base 2025-09-07T07:38:46.4598330Z * [new branch] gh/rtimpe/10/head -> origin/gh/rtimpe/10/head 2025-09-07T07:38:46.4598930Z * [new branch] gh/rtimpe/10/orig -> origin/gh/rtimpe/10/orig 2025-09-07T07:38:46.4599671Z * [new branch] gh/rtimpe/11/base -> origin/gh/rtimpe/11/base 2025-09-07T07:38:46.4600065Z * [new branch] gh/rtimpe/11/head -> origin/gh/rtimpe/11/head 2025-09-07T07:38:46.4600676Z * [new branch] gh/rtimpe/11/orig -> origin/gh/rtimpe/11/orig 2025-09-07T07:38:46.4601338Z * [new branch] gh/rtimpe/12/base -> origin/gh/rtimpe/12/base 2025-09-07T07:38:46.4601764Z * [new branch] gh/rtimpe/12/head -> origin/gh/rtimpe/12/head 2025-09-07T07:38:46.4602478Z * [new branch] gh/rtimpe/12/orig -> origin/gh/rtimpe/12/orig 2025-09-07T07:38:46.4603209Z * [new branch] gh/rtimpe/13/base -> origin/gh/rtimpe/13/base 2025-09-07T07:38:46.4603774Z * [new branch] gh/rtimpe/13/head -> origin/gh/rtimpe/13/head 2025-09-07T07:38:46.4604201Z * [new branch] gh/rtimpe/13/orig -> origin/gh/rtimpe/13/orig 2025-09-07T07:38:46.4604955Z * [new branch] gh/rtimpe/14/base -> 
origin/gh/rtimpe/14/base 2025-09-07T07:38:46.4605361Z * [new branch] gh/rtimpe/14/head -> origin/gh/rtimpe/14/head 2025-09-07T07:38:46.4605983Z * [new branch] gh/rtimpe/14/orig -> origin/gh/rtimpe/14/orig 2025-09-07T07:38:46.4606656Z * [new branch] gh/rtimpe/15/base -> origin/gh/rtimpe/15/base 2025-09-07T07:38:46.4607062Z * [new branch] gh/rtimpe/15/head -> origin/gh/rtimpe/15/head 2025-09-07T07:38:46.4607677Z * [new branch] gh/rtimpe/15/orig -> origin/gh/rtimpe/15/orig 2025-09-07T07:38:46.4608346Z * [new branch] gh/rtimpe/2/base -> origin/gh/rtimpe/2/base 2025-09-07T07:38:46.4608723Z * [new branch] gh/rtimpe/2/head -> origin/gh/rtimpe/2/head 2025-09-07T07:38:46.4609468Z * [new branch] gh/rtimpe/3/base -> origin/gh/rtimpe/3/base 2025-09-07T07:38:46.4609872Z * [new branch] gh/rtimpe/3/head -> origin/gh/rtimpe/3/head 2025-09-07T07:38:46.4610652Z * [new branch] gh/rtimpe/4/base -> origin/gh/rtimpe/4/base 2025-09-07T07:38:46.4611271Z * [new branch] gh/rtimpe/4/head -> origin/gh/rtimpe/4/head 2025-09-07T07:38:46.4612003Z * [new branch] gh/rtimpe/9/base -> origin/gh/rtimpe/9/base 2025-09-07T07:38:46.4612401Z * [new branch] gh/rtimpe/9/head -> origin/gh/rtimpe/9/head 2025-09-07T07:38:46.4613051Z * [new branch] gh/rtimpe/9/orig -> origin/gh/rtimpe/9/orig 2025-09-07T07:38:46.4613956Z * [new branch] gh/ruisizhang123/1/base -> origin/gh/ruisizhang123/1/base 2025-09-07T07:38:46.4614399Z * [new branch] gh/ruisizhang123/1/head -> origin/gh/ruisizhang123/1/head 2025-09-07T07:38:46.4615007Z * [new branch] gh/ruisizhang123/1/orig -> origin/gh/ruisizhang123/1/orig 2025-09-07T07:38:46.4615716Z * [new branch] gh/ruisizhang123/4/base -> origin/gh/ruisizhang123/4/base 2025-09-07T07:38:46.4616076Z * [new branch] gh/ruisizhang123/4/head -> origin/gh/ruisizhang123/4/head 2025-09-07T07:38:46.4616654Z * [new branch] gh/ruisizhang123/4/orig -> origin/gh/ruisizhang123/4/orig 2025-09-07T07:38:46.4617727Z * [new branch] gh/ruisizhang123/5/base -> origin/gh/ruisizhang123/5/base 2025-09-07T07:38:46.4618169Z * [new branch] gh/ruisizhang123/5/head -> origin/gh/ruisizhang123/5/head 2025-09-07T07:38:46.4618761Z * [new branch] gh/ruisizhang123/5/orig -> origin/gh/ruisizhang123/5/orig 2025-09-07T07:38:46.4619437Z * [new branch] gh/ruisizhang123/6/base -> origin/gh/ruisizhang123/6/base 2025-09-07T07:38:46.4619881Z * [new branch] gh/ruisizhang123/6/head -> origin/gh/ruisizhang123/6/head 2025-09-07T07:38:46.4620618Z * [new branch] gh/ruisizhang123/6/orig -> origin/gh/ruisizhang123/6/orig 2025-09-07T07:38:46.4621302Z * [new branch] gh/ruisizhang123/7/base -> origin/gh/ruisizhang123/7/base 2025-09-07T07:38:46.4621723Z * [new branch] gh/ruisizhang123/7/head -> origin/gh/ruisizhang123/7/head 2025-09-07T07:38:46.4622294Z * [new branch] gh/ruisizhang123/7/orig -> origin/gh/ruisizhang123/7/orig 2025-09-07T07:38:46.4623000Z * [new branch] gh/ruisizhang123/8/base -> origin/gh/ruisizhang123/8/base 2025-09-07T07:38:46.4623401Z * [new branch] gh/ruisizhang123/8/head -> origin/gh/ruisizhang123/8/head 2025-09-07T07:38:46.4623993Z * [new branch] gh/ruisizhang123/8/orig -> origin/gh/ruisizhang123/8/orig 2025-09-07T07:38:46.4624777Z * [new branch] gh/ruisizhang123/9/base -> origin/gh/ruisizhang123/9/base 2025-09-07T07:38:46.4625201Z * [new branch] gh/ruisizhang123/9/head -> origin/gh/ruisizhang123/9/head 2025-09-07T07:38:46.4625820Z * [new branch] gh/ruisizhang123/9/orig -> origin/gh/ruisizhang123/9/orig 2025-09-07T07:38:46.4626744Z * [new branch] gh/sarckk/2/base -> origin/gh/sarckk/2/base 2025-09-07T07:38:46.4627158Z * [new branch] 
gh/sarckk/2/head -> origin/gh/sarckk/2/head 2025-09-07T07:38:46.4627764Z * [new branch] gh/sarckk/2/orig -> origin/gh/sarckk/2/orig 2025-09-07T07:38:46.4628644Z * [new branch] gh/seemethere/35/base -> origin/gh/seemethere/35/base 2025-09-07T07:38:46.4629272Z * [new branch] gh/seemethere/35/head -> origin/gh/seemethere/35/head 2025-09-07T07:38:46.4629726Z * [new branch] gh/seemethere/35/orig -> origin/gh/seemethere/35/orig 2025-09-07T07:38:46.4630511Z * [new branch] gh/seemethere/37/base -> origin/gh/seemethere/37/base 2025-09-07T07:38:46.4631133Z * [new branch] gh/seemethere/37/head -> origin/gh/seemethere/37/head 2025-09-07T07:38:46.4631552Z * [new branch] gh/seemethere/37/orig -> origin/gh/seemethere/37/orig 2025-09-07T07:38:46.4632320Z * [new branch] gh/seemethere/43/base -> origin/gh/seemethere/43/base 2025-09-07T07:38:46.4632785Z * [new branch] gh/seemethere/43/head -> origin/gh/seemethere/43/head 2025-09-07T07:38:46.4633369Z * [new branch] gh/seemethere/43/orig -> origin/gh/seemethere/43/orig 2025-09-07T07:38:46.4634059Z * [new branch] gh/seemethere/44/base -> origin/gh/seemethere/44/base 2025-09-07T07:38:46.4634432Z * [new branch] gh/seemethere/44/head -> origin/gh/seemethere/44/head 2025-09-07T07:38:46.4635008Z * [new branch] gh/seemethere/44/orig -> origin/gh/seemethere/44/orig 2025-09-07T07:38:46.4635690Z * [new branch] gh/seemethere/48/base -> origin/gh/seemethere/48/base 2025-09-07T07:38:46.4636103Z * [new branch] gh/seemethere/48/head -> origin/gh/seemethere/48/head 2025-09-07T07:38:46.4636702Z * [new branch] gh/seemethere/48/orig -> origin/gh/seemethere/48/orig 2025-09-07T07:38:46.4637392Z * [new branch] gh/seemethere/49/base -> origin/gh/seemethere/49/base 2025-09-07T07:38:46.4638063Z * [new branch] gh/seemethere/49/head -> origin/gh/seemethere/49/head 2025-09-07T07:38:46.4638527Z * [new branch] gh/seemethere/49/orig -> origin/gh/seemethere/49/orig 2025-09-07T07:38:46.4639286Z * [new branch] gh/seemethere/52/base -> origin/gh/seemethere/52/base 2025-09-07T07:38:46.4639749Z * [new branch] gh/seemethere/52/head -> origin/gh/seemethere/52/head 2025-09-07T07:38:46.4640366Z * [new branch] gh/seemethere/52/orig -> origin/gh/seemethere/52/orig 2025-09-07T07:38:46.4641066Z * [new branch] gh/seemethere/53/base -> origin/gh/seemethere/53/base 2025-09-07T07:38:46.4641448Z * [new branch] gh/seemethere/53/head -> origin/gh/seemethere/53/head 2025-09-07T07:38:46.4642032Z * [new branch] gh/seemethere/53/orig -> origin/gh/seemethere/53/orig 2025-09-07T07:38:46.4643012Z * [new branch] gh/seemethere/54/base -> origin/gh/seemethere/54/base 2025-09-07T07:38:46.4643431Z * [new branch] gh/seemethere/54/head -> origin/gh/seemethere/54/head 2025-09-07T07:38:46.4644081Z * [new branch] gh/seemethere/54/orig -> origin/gh/seemethere/54/orig 2025-09-07T07:38:46.4644713Z * [new branch] gh/seemethere/55/base -> origin/gh/seemethere/55/base 2025-09-07T07:38:46.4645143Z * [new branch] gh/seemethere/55/head -> origin/gh/seemethere/55/head 2025-09-07T07:38:46.4645723Z * [new branch] gh/seemethere/55/orig -> origin/gh/seemethere/55/orig 2025-09-07T07:38:46.4646460Z * [new branch] gh/seemethere/56/base -> origin/gh/seemethere/56/base 2025-09-07T07:38:46.4647041Z * [new branch] gh/seemethere/56/head -> origin/gh/seemethere/56/head 2025-09-07T07:38:46.4647494Z * [new branch] gh/seemethere/56/orig -> origin/gh/seemethere/56/orig 2025-09-07T07:38:46.4648310Z * [new branch] gh/seemethere/57/base -> origin/gh/seemethere/57/base 2025-09-07T07:38:46.4648764Z * [new branch] gh/seemethere/57/head -> 
origin/gh/seemethere/57/head 2025-09-07T07:38:46.4649337Z * [new branch] gh/seemethere/57/orig -> origin/gh/seemethere/57/orig 2025-09-07T07:38:46.4649996Z * [new branch] gh/seemethere/58/base -> origin/gh/seemethere/58/base 2025-09-07T07:38:46.4650392Z * [new branch] gh/seemethere/58/head -> origin/gh/seemethere/58/head 2025-09-07T07:38:46.4650998Z * [new branch] gh/seemethere/58/orig -> origin/gh/seemethere/58/orig 2025-09-07T07:38:46.4651652Z * [new branch] gh/seemethere/59/base -> origin/gh/seemethere/59/base 2025-09-07T07:38:46.4652125Z * [new branch] gh/seemethere/59/head -> origin/gh/seemethere/59/head 2025-09-07T07:38:46.4652728Z * [new branch] gh/seemethere/59/orig -> origin/gh/seemethere/59/orig 2025-09-07T07:38:46.4653420Z * [new branch] gh/seemethere/60/base -> origin/gh/seemethere/60/base 2025-09-07T07:38:46.4653822Z * [new branch] gh/seemethere/60/head -> origin/gh/seemethere/60/head 2025-09-07T07:38:46.4654434Z * [new branch] gh/seemethere/60/orig -> origin/gh/seemethere/60/orig 2025-09-07T07:38:46.4655108Z * [new branch] gh/seemethere/61/base -> origin/gh/seemethere/61/base 2025-09-07T07:38:46.4655695Z * [new branch] gh/seemethere/61/head -> origin/gh/seemethere/61/head 2025-09-07T07:38:46.4656142Z * [new branch] gh/seemethere/61/orig -> origin/gh/seemethere/61/orig 2025-09-07T07:38:46.4656904Z * [new branch] gh/seemethere/62/base -> origin/gh/seemethere/62/base 2025-09-07T07:38:46.4657337Z * [new branch] gh/seemethere/62/head -> origin/gh/seemethere/62/head 2025-09-07T07:38:46.4657915Z * [new branch] gh/seemethere/62/orig -> origin/gh/seemethere/62/orig 2025-09-07T07:38:46.4658563Z * [new branch] gh/seemethere/63/base -> origin/gh/seemethere/63/base 2025-09-07T07:38:46.4659145Z * [new branch] gh/seemethere/63/head -> origin/gh/seemethere/63/head 2025-09-07T07:38:46.4659587Z * [new branch] gh/seemethere/63/orig -> origin/gh/seemethere/63/orig 2025-09-07T07:38:46.4660732Z * [new branch] gh/shunting314/145/base -> origin/gh/shunting314/145/base 2025-09-07T07:38:46.4661308Z * [new branch] gh/shunting314/145/head -> origin/gh/shunting314/145/head 2025-09-07T07:38:46.4661902Z * [new branch] gh/shunting314/145/orig -> origin/gh/shunting314/145/orig 2025-09-07T07:38:46.4662796Z * [new branch] gh/shunting314/176/base -> origin/gh/shunting314/176/base 2025-09-07T07:38:46.4663467Z * [new branch] gh/shunting314/176/head -> origin/gh/shunting314/176/head 2025-09-07T07:38:46.4663908Z * [new branch] gh/shunting314/176/orig -> origin/gh/shunting314/176/orig 2025-09-07T07:38:46.4664708Z * [new branch] gh/shunting314/211/base -> origin/gh/shunting314/211/base 2025-09-07T07:38:46.4665394Z * [new branch] gh/shunting314/211/head -> origin/gh/shunting314/211/head 2025-09-07T07:38:46.4665845Z * [new branch] gh/shunting314/211/orig -> origin/gh/shunting314/211/orig 2025-09-07T07:38:46.4666671Z * [new branch] gh/shunting314/212/base -> origin/gh/shunting314/212/base 2025-09-07T07:38:46.4667484Z * [new branch] gh/shunting314/212/head -> origin/gh/shunting314/212/head 2025-09-07T07:38:46.4667945Z * [new branch] gh/shunting314/212/orig -> origin/gh/shunting314/212/orig 2025-09-07T07:38:46.4668715Z * [new branch] gh/shunting314/213/base -> origin/gh/shunting314/213/base 2025-09-07T07:38:46.4669294Z * [new branch] gh/shunting314/213/head -> origin/gh/shunting314/213/head 2025-09-07T07:38:46.4669709Z * [new branch] gh/shunting314/213/orig -> origin/gh/shunting314/213/orig 2025-09-07T07:38:46.4670488Z * [new branch] gh/shunting314/214/base -> origin/gh/shunting314/214/base 2025-09-07T07:38:46.4670900Z * 
[new branch] gh/shunting314/214/head -> origin/gh/shunting314/214/head 2025-09-07T07:38:46.4671469Z * [new branch] gh/shunting314/214/orig -> origin/gh/shunting314/214/orig 2025-09-07T07:38:46.4672274Z * [new branch] gh/shunting314/215/base -> origin/gh/shunting314/215/base 2025-09-07T07:38:46.4672736Z * [new branch] gh/shunting314/215/head -> origin/gh/shunting314/215/head 2025-09-07T07:38:46.4673324Z * [new branch] gh/shunting314/215/orig -> origin/gh/shunting314/215/orig 2025-09-07T07:38:46.4674164Z * [new branch] gh/shunting314/216/base -> origin/gh/shunting314/216/base 2025-09-07T07:38:46.4674571Z * [new branch] gh/shunting314/216/head -> origin/gh/shunting314/216/head 2025-09-07T07:38:46.4675151Z * [new branch] gh/shunting314/216/orig -> origin/gh/shunting314/216/orig 2025-09-07T07:38:46.4675926Z * [new branch] gh/shunting314/217/base -> origin/gh/shunting314/217/base 2025-09-07T07:38:46.4676313Z * [new branch] gh/shunting314/217/head -> origin/gh/shunting314/217/head 2025-09-07T07:38:46.4676905Z * [new branch] gh/shunting314/217/orig -> origin/gh/shunting314/217/orig 2025-09-07T07:38:46.4677709Z * [new branch] gh/shunting314/218/base -> origin/gh/shunting314/218/base 2025-09-07T07:38:46.4678099Z * [new branch] gh/shunting314/218/head -> origin/gh/shunting314/218/head 2025-09-07T07:38:46.4678674Z * [new branch] gh/shunting314/218/orig -> origin/gh/shunting314/218/orig 2025-09-07T07:38:46.4679296Z * [new branch] gh/shunting314/219/base -> origin/gh/shunting314/219/base 2025-09-07T07:38:46.4679706Z * [new branch] gh/shunting314/219/head -> origin/gh/shunting314/219/head 2025-09-07T07:38:46.4680292Z * [new branch] gh/shunting314/219/orig -> origin/gh/shunting314/219/orig 2025-09-07T07:38:46.4681252Z * [new branch] gh/shunting314/220/base -> origin/gh/shunting314/220/base 2025-09-07T07:38:46.4681918Z * [new branch] gh/shunting314/220/head -> origin/gh/shunting314/220/head 2025-09-07T07:38:46.4682507Z * [new branch] gh/shunting314/220/orig -> origin/gh/shunting314/220/orig 2025-09-07T07:38:46.4683282Z * [new branch] gh/shunting314/221/base -> origin/gh/shunting314/221/base 2025-09-07T07:38:46.4683683Z * [new branch] gh/shunting314/221/head -> origin/gh/shunting314/221/head 2025-09-07T07:38:46.4684266Z * [new branch] gh/shunting314/221/orig -> origin/gh/shunting314/221/orig 2025-09-07T07:38:46.4684940Z * [new branch] gh/shunting314/222/base -> origin/gh/shunting314/222/base 2025-09-07T07:38:46.4685326Z * [new branch] gh/shunting314/222/head -> origin/gh/shunting314/222/head 2025-09-07T07:38:46.4686252Z * [new branch] gh/shunting314/222/orig -> origin/gh/shunting314/222/orig 2025-09-07T07:38:46.4686885Z * [new branch] gh/shunting314/223/base -> origin/gh/shunting314/223/base 2025-09-07T07:38:46.4687296Z * [new branch] gh/shunting314/223/head -> origin/gh/shunting314/223/head 2025-09-07T07:38:46.4687880Z * [new branch] gh/shunting314/223/orig -> origin/gh/shunting314/223/orig 2025-09-07T07:38:46.4688912Z * [new branch] gh/silverguo/1/base -> origin/gh/silverguo/1/base 2025-09-07T07:38:46.4689373Z * [new branch] gh/silverguo/1/head -> origin/gh/silverguo/1/head 2025-09-07T07:38:46.4690445Z * [new branch] gh/silverguo/2/base -> origin/gh/silverguo/2/base 2025-09-07T07:38:46.4690795Z * [new branch] gh/silverguo/2/head -> origin/gh/silverguo/2/head 2025-09-07T07:38:46.4691513Z * [new branch] gh/silverguo/3/base -> origin/gh/silverguo/3/base 2025-09-07T07:38:46.4691937Z * [new branch] gh/silverguo/3/head -> origin/gh/silverguo/3/head 2025-09-07T07:38:46.4692758Z * [new branch] gh/silverguo/4/base 
-> origin/gh/silverguo/4/base 2025-09-07T07:38:46.4693232Z * [new branch] gh/silverguo/4/head -> origin/gh/silverguo/4/head 2025-09-07T07:38:46.4694168Z * [new branch] gh/sinhaanhsul/1/base -> origin/gh/sinhaanhsul/1/base 2025-09-07T07:38:46.4694741Z * [new branch] gh/sinhaanhsul/1/head -> origin/gh/sinhaanhsul/1/head 2025-09-07T07:38:46.4695659Z * [new branch] gh/skarjala/17/base -> origin/gh/skarjala/17/base 2025-09-07T07:38:46.4696068Z * [new branch] gh/skarjala/17/head -> origin/gh/skarjala/17/head 2025-09-07T07:38:46.4696683Z * [new branch] gh/skarjala/17/orig -> origin/gh/skarjala/17/orig 2025-09-07T07:38:46.4697388Z * [new branch] gh/skarjala/18/base -> origin/gh/skarjala/18/base 2025-09-07T07:38:46.4697860Z * [new branch] gh/skarjala/18/head -> origin/gh/skarjala/18/head 2025-09-07T07:38:46.4698422Z * [new branch] gh/skarjala/18/orig -> origin/gh/skarjala/18/orig 2025-09-07T07:38:46.4699129Z * [new branch] gh/skarjala/19/base -> origin/gh/skarjala/19/base 2025-09-07T07:38:46.4699529Z * [new branch] gh/skarjala/19/head -> origin/gh/skarjala/19/head 2025-09-07T07:38:46.4700195Z * [new branch] gh/skarjala/19/orig -> origin/gh/skarjala/19/orig 2025-09-07T07:38:46.4701074Z * [new branch] gh/slayton58/1/base -> origin/gh/slayton58/1/base 2025-09-07T07:38:46.4701635Z * [new branch] gh/slayton58/1/head -> origin/gh/slayton58/1/head 2025-09-07T07:38:46.4702252Z * [new branch] gh/slayton58/1/orig -> origin/gh/slayton58/1/orig 2025-09-07T07:38:46.4702928Z * [new branch] gh/slayton58/2/base -> origin/gh/slayton58/2/base 2025-09-07T07:38:46.4703359Z * [new branch] gh/slayton58/2/head -> origin/gh/slayton58/2/head 2025-09-07T07:38:46.4704268Z * [new branch] gh/slayton58/2/orig -> origin/gh/slayton58/2/orig 2025-09-07T07:38:46.4705247Z * [new branch] gh/slayton58/3/base -> origin/gh/slayton58/3/base 2025-09-07T07:38:46.4705646Z * [new branch] gh/slayton58/3/head -> origin/gh/slayton58/3/head 2025-09-07T07:38:46.4706267Z * [new branch] gh/slayton58/3/orig -> origin/gh/slayton58/3/orig 2025-09-07T07:38:46.4706899Z * [new branch] gh/slayton58/4/base -> origin/gh/slayton58/4/base 2025-09-07T07:38:46.4707331Z * [new branch] gh/slayton58/4/head -> origin/gh/slayton58/4/head 2025-09-07T07:38:46.4707935Z * [new branch] gh/slayton58/4/orig -> origin/gh/slayton58/4/orig 2025-09-07T07:38:46.4708613Z * [new branch] gh/slayton58/5/base -> origin/gh/slayton58/5/base 2025-09-07T07:38:46.4709131Z * [new branch] gh/slayton58/5/head -> origin/gh/slayton58/5/head 2025-09-07T07:38:46.4709635Z * [new branch] gh/slayton58/5/orig -> origin/gh/slayton58/5/orig 2025-09-07T07:38:46.4710932Z * [new branch] gh/soulitzer/269/base -> origin/gh/soulitzer/269/base 2025-09-07T07:38:46.4711313Z * [new branch] gh/soulitzer/269/head -> origin/gh/soulitzer/269/head 2025-09-07T07:38:46.4712068Z * [new branch] gh/soulitzer/269/orig -> origin/gh/soulitzer/269/orig 2025-09-07T07:38:46.4712832Z * [new branch] gh/soulitzer/276/base -> origin/gh/soulitzer/276/base 2025-09-07T07:38:46.4713282Z * [new branch] gh/soulitzer/276/head -> origin/gh/soulitzer/276/head 2025-09-07T07:38:46.4713859Z * [new branch] gh/soulitzer/276/orig -> origin/gh/soulitzer/276/orig 2025-09-07T07:38:46.4714777Z * [new branch] gh/soulitzer/287/base -> origin/gh/soulitzer/287/base 2025-09-07T07:38:46.4715187Z * [new branch] gh/soulitzer/287/head -> origin/gh/soulitzer/287/head 2025-09-07T07:38:46.4715792Z * [new branch] gh/soulitzer/287/orig -> origin/gh/soulitzer/287/orig 2025-09-07T07:38:46.4716978Z * [new branch] gh/soulitzer/296/base -> 
origin/gh/soulitzer/296/base 2025-09-07T07:38:46.4717438Z * [new branch] gh/soulitzer/296/head -> origin/gh/soulitzer/296/head 2025-09-07T07:38:46.4718063Z * [new branch] gh/soulitzer/296/orig -> origin/gh/soulitzer/296/orig 2025-09-07T07:38:46.4718840Z * [new branch] gh/soulitzer/299/base -> origin/gh/soulitzer/299/base 2025-09-07T07:38:46.4719421Z * [new branch] gh/soulitzer/299/head -> origin/gh/soulitzer/299/head 2025-09-07T07:38:46.4719879Z * [new branch] gh/soulitzer/299/orig -> origin/gh/soulitzer/299/orig 2025-09-07T07:38:46.4720746Z * [new branch] gh/soulitzer/300/base -> origin/gh/soulitzer/300/base 2025-09-07T07:38:46.4721333Z * [new branch] gh/soulitzer/300/head -> origin/gh/soulitzer/300/head 2025-09-07T07:38:46.4721776Z * [new branch] gh/soulitzer/300/orig -> origin/gh/soulitzer/300/orig 2025-09-07T07:38:46.4722667Z * [new branch] gh/soulitzer/301/base -> origin/gh/soulitzer/301/base 2025-09-07T07:38:46.4723270Z * [new branch] gh/soulitzer/301/head -> origin/gh/soulitzer/301/head 2025-09-07T07:38:46.4723628Z * [new branch] gh/soulitzer/301/orig -> origin/gh/soulitzer/301/orig 2025-09-07T07:38:46.4724491Z * [new branch] gh/soulitzer/313/base -> origin/gh/soulitzer/313/base 2025-09-07T07:38:46.4724945Z * [new branch] gh/soulitzer/313/head -> origin/gh/soulitzer/313/head 2025-09-07T07:38:46.4725536Z * [new branch] gh/soulitzer/313/orig -> origin/gh/soulitzer/313/orig 2025-09-07T07:38:46.4726220Z * [new branch] gh/soulitzer/319/base -> origin/gh/soulitzer/319/base 2025-09-07T07:38:46.4726642Z * [new branch] gh/soulitzer/319/head -> origin/gh/soulitzer/319/head 2025-09-07T07:38:46.4727228Z * [new branch] gh/soulitzer/319/orig -> origin/gh/soulitzer/319/orig 2025-09-07T07:38:46.4728003Z * [new branch] gh/soulitzer/320/base -> origin/gh/soulitzer/320/base 2025-09-07T07:38:46.4728390Z * [new branch] gh/soulitzer/320/head -> origin/gh/soulitzer/320/head 2025-09-07T07:38:46.4729012Z * [new branch] gh/soulitzer/320/orig -> origin/gh/soulitzer/320/orig 2025-09-07T07:38:46.4729782Z * [new branch] gh/soulitzer/336/base -> origin/gh/soulitzer/336/base 2025-09-07T07:38:46.4730256Z * [new branch] gh/soulitzer/336/head -> origin/gh/soulitzer/336/head 2025-09-07T07:38:46.4730841Z * [new branch] gh/soulitzer/336/orig -> origin/gh/soulitzer/336/orig 2025-09-07T07:38:46.4731662Z * [new branch] gh/soulitzer/347/base -> origin/gh/soulitzer/347/base 2025-09-07T07:38:46.4732042Z * [new branch] gh/soulitzer/347/head -> origin/gh/soulitzer/347/head 2025-09-07T07:38:46.4732624Z * [new branch] gh/soulitzer/347/orig -> origin/gh/soulitzer/347/orig 2025-09-07T07:38:46.4733476Z * [new branch] gh/soulitzer/349/base -> origin/gh/soulitzer/349/base 2025-09-07T07:38:46.4733908Z * [new branch] gh/soulitzer/349/head -> origin/gh/soulitzer/349/head 2025-09-07T07:38:46.4734516Z * [new branch] gh/soulitzer/349/orig -> origin/gh/soulitzer/349/orig 2025-09-07T07:38:46.4735201Z * [new branch] gh/soulitzer/350/base -> origin/gh/soulitzer/350/base 2025-09-07T07:38:46.4735634Z * [new branch] gh/soulitzer/350/head -> origin/gh/soulitzer/350/head 2025-09-07T07:38:46.4736219Z * [new branch] gh/soulitzer/350/orig -> origin/gh/soulitzer/350/orig 2025-09-07T07:38:46.4737007Z * [new branch] gh/soulitzer/351/base -> origin/gh/soulitzer/351/base 2025-09-07T07:38:46.4737430Z * [new branch] gh/soulitzer/351/head -> origin/gh/soulitzer/351/head 2025-09-07T07:38:46.4738008Z * [new branch] gh/soulitzer/351/orig -> origin/gh/soulitzer/351/orig 2025-09-07T07:38:46.4738767Z * [new branch] gh/soulitzer/353/base -> 
origin/gh/soulitzer/353/base 2025-09-07T07:38:46.4739479Z * [new branch] gh/soulitzer/353/head -> origin/gh/soulitzer/353/head 2025-09-07T07:38:46.4739925Z * [new branch] gh/soulitzer/353/orig -> origin/gh/soulitzer/353/orig 2025-09-07T07:38:46.4741138Z * [new branch] gh/soulitzer/358/base -> origin/gh/soulitzer/358/base 2025-09-07T07:38:46.4741590Z * [new branch] gh/soulitzer/358/head -> origin/gh/soulitzer/358/head 2025-09-07T07:38:46.4742230Z * [new branch] gh/soulitzer/358/orig -> origin/gh/soulitzer/358/orig 2025-09-07T07:38:46.4743236Z * [new branch] gh/soulitzer/359/base -> origin/gh/soulitzer/359/base 2025-09-07T07:38:46.4744053Z * [new branch] gh/soulitzer/359/head -> origin/gh/soulitzer/359/head 2025-09-07T07:38:46.4744495Z * [new branch] gh/soulitzer/359/orig -> origin/gh/soulitzer/359/orig 2025-09-07T07:38:46.4745318Z * [new branch] gh/soulitzer/362/base -> origin/gh/soulitzer/362/base 2025-09-07T07:38:46.4745761Z * [new branch] gh/soulitzer/362/head -> origin/gh/soulitzer/362/head 2025-09-07T07:38:46.4746353Z * [new branch] gh/soulitzer/362/orig -> origin/gh/soulitzer/362/orig 2025-09-07T07:38:46.4747150Z * [new branch] gh/soulitzer/372/base -> origin/gh/soulitzer/372/base 2025-09-07T07:38:46.4747546Z * [new branch] gh/soulitzer/372/head -> origin/gh/soulitzer/372/head 2025-09-07T07:38:46.4748141Z * [new branch] gh/soulitzer/372/orig -> origin/gh/soulitzer/372/orig 2025-09-07T07:38:46.4749197Z * [new branch] gh/soulitzer/373/base -> origin/gh/soulitzer/373/base 2025-09-07T07:38:46.4749853Z * [new branch] gh/soulitzer/373/head -> origin/gh/soulitzer/373/head 2025-09-07T07:38:46.4750266Z * [new branch] gh/soulitzer/373/orig -> origin/gh/soulitzer/373/orig 2025-09-07T07:38:46.4751135Z * [new branch] gh/soulitzer/374/base -> origin/gh/soulitzer/374/base 2025-09-07T07:38:46.4751537Z * [new branch] gh/soulitzer/374/head -> origin/gh/soulitzer/374/head 2025-09-07T07:38:46.4752167Z * [new branch] gh/soulitzer/374/orig -> origin/gh/soulitzer/374/orig 2025-09-07T07:38:46.4752905Z * [new branch] gh/soulitzer/375/base -> origin/gh/soulitzer/375/base 2025-09-07T07:38:46.4753286Z * [new branch] gh/soulitzer/375/head -> origin/gh/soulitzer/375/head 2025-09-07T07:38:46.4753867Z * [new branch] gh/soulitzer/375/orig -> origin/gh/soulitzer/375/orig 2025-09-07T07:38:46.4754675Z * [new branch] gh/soulitzer/376/base -> origin/gh/soulitzer/376/base 2025-09-07T07:38:46.4755087Z * [new branch] gh/soulitzer/376/head -> origin/gh/soulitzer/376/head 2025-09-07T07:38:46.4755658Z * [new branch] gh/soulitzer/376/orig -> origin/gh/soulitzer/376/orig 2025-09-07T07:38:46.4756700Z * [new branch] gh/soulitzer/377/base -> origin/gh/soulitzer/377/base 2025-09-07T07:38:46.4757044Z * [new branch] gh/soulitzer/377/head -> origin/gh/soulitzer/377/head 2025-09-07T07:38:46.4757699Z * [new branch] gh/soulitzer/377/orig -> origin/gh/soulitzer/377/orig 2025-09-07T07:38:46.4758441Z * [new branch] gh/soulitzer/378/base -> origin/gh/soulitzer/378/base 2025-09-07T07:38:46.4759097Z * [new branch] gh/soulitzer/378/head -> origin/gh/soulitzer/378/head 2025-09-07T07:38:46.4759529Z * [new branch] gh/soulitzer/378/orig -> origin/gh/soulitzer/378/orig 2025-09-07T07:38:46.4760327Z * [new branch] gh/soulitzer/379/base -> origin/gh/soulitzer/379/base 2025-09-07T07:38:46.4760743Z * [new branch] gh/soulitzer/379/head -> origin/gh/soulitzer/379/head 2025-09-07T07:38:46.4761344Z * [new branch] gh/soulitzer/379/orig -> origin/gh/soulitzer/379/orig 2025-09-07T07:38:46.4762338Z * [new branch] gh/swolchok/728/next -> origin/gh/swolchok/728/next 
2025-09-07T07:38:46.4763333Z * [new branch] gh/swolchok/767/base -> origin/gh/swolchok/767/base 2025-09-07T07:38:46.4763996Z * [new branch] gh/swolchok/767/head -> origin/gh/swolchok/767/head 2025-09-07T07:38:46.4764697Z * [new branch] gh/swolchok/767/orig -> origin/gh/swolchok/767/orig 2025-09-07T07:38:46.4765464Z * [new branch] gh/swolchok/768/base -> origin/gh/swolchok/768/base 2025-09-07T07:38:46.4766031Z * [new branch] gh/swolchok/768/head -> origin/gh/swolchok/768/head 2025-09-07T07:38:46.4766563Z * [new branch] gh/swolchok/768/orig -> origin/gh/swolchok/768/orig 2025-09-07T07:38:46.4767455Z * [new branch] gh/swolchok/769/base -> origin/gh/swolchok/769/base 2025-09-07T07:38:46.4767881Z * [new branch] gh/swolchok/769/head -> origin/gh/swolchok/769/head 2025-09-07T07:38:46.4768709Z * [new branch] gh/swolchok/769/orig -> origin/gh/swolchok/769/orig 2025-09-07T07:38:46.4769449Z * [new branch] gh/swolchok/771/base -> origin/gh/swolchok/771/base 2025-09-07T07:38:46.4770148Z * [new branch] gh/swolchok/771/head -> origin/gh/swolchok/771/head 2025-09-07T07:38:46.4770596Z * [new branch] gh/swolchok/771/orig -> origin/gh/swolchok/771/orig 2025-09-07T07:38:46.4771399Z * [new branch] gh/swolchok/772/base -> origin/gh/swolchok/772/base 2025-09-07T07:38:46.4771962Z * [new branch] gh/swolchok/772/head -> origin/gh/swolchok/772/head 2025-09-07T07:38:46.4772416Z * [new branch] gh/swolchok/772/orig -> origin/gh/swolchok/772/orig 2025-09-07T07:38:46.4773909Z * [new branch] gh/swolchok/773/base -> origin/gh/swolchok/773/base 2025-09-07T07:38:46.4774499Z * [new branch] gh/swolchok/773/head -> origin/gh/swolchok/773/head 2025-09-07T07:38:46.4775067Z * [new branch] gh/swolchok/773/orig -> origin/gh/swolchok/773/orig 2025-09-07T07:38:46.4775829Z * [new branch] gh/swolchok/786/base -> origin/gh/swolchok/786/base 2025-09-07T07:38:46.4776202Z * [new branch] gh/swolchok/786/head -> origin/gh/swolchok/786/head 2025-09-07T07:38:46.4776811Z * [new branch] gh/swolchok/786/orig -> origin/gh/swolchok/786/orig 2025-09-07T07:38:46.4777463Z * [new branch] gh/swolchok/787/base -> origin/gh/swolchok/787/base 2025-09-07T07:38:46.4778029Z * [new branch] gh/swolchok/787/head -> origin/gh/swolchok/787/head 2025-09-07T07:38:46.4778654Z * [new branch] gh/swolchok/787/orig -> origin/gh/swolchok/787/orig 2025-09-07T07:38:46.4779367Z * [new branch] gh/swolchok/788/base -> origin/gh/swolchok/788/base 2025-09-07T07:38:46.4779797Z * [new branch] gh/swolchok/788/head -> origin/gh/swolchok/788/head 2025-09-07T07:38:46.4780397Z * [new branch] gh/swolchok/788/orig -> origin/gh/swolchok/788/orig 2025-09-07T07:38:46.4781090Z * [new branch] gh/swolchok/789/base -> origin/gh/swolchok/789/base 2025-09-07T07:38:46.4781547Z * [new branch] gh/swolchok/789/head -> origin/gh/swolchok/789/head 2025-09-07T07:38:46.4782189Z * [new branch] gh/swolchok/789/orig -> origin/gh/swolchok/789/orig 2025-09-07T07:38:46.4782883Z * [new branch] gh/swolchok/790/base -> origin/gh/swolchok/790/base 2025-09-07T07:38:46.4783623Z * [new branch] gh/swolchok/790/head -> origin/gh/swolchok/790/head 2025-09-07T07:38:46.4784053Z * [new branch] gh/swolchok/790/orig -> origin/gh/swolchok/790/orig 2025-09-07T07:38:46.4784908Z * [new branch] gh/swolchok/791/base -> origin/gh/swolchok/791/base 2025-09-07T07:38:46.4785371Z * [new branch] gh/swolchok/791/head -> origin/gh/swolchok/791/head 2025-09-07T07:38:46.4785984Z * [new branch] gh/swolchok/791/orig -> origin/gh/swolchok/791/orig 2025-09-07T07:38:46.4786714Z * [new branch] gh/swolchok/792/base -> origin/gh/swolchok/792/base 
2025-09-07T07:38:46.4787151Z * [new branch] gh/swolchok/792/head -> origin/gh/swolchok/792/head 2025-09-07T07:38:46.4788182Z * [new branch] gh/swolchok/792/orig -> origin/gh/swolchok/792/orig 2025-09-07T07:38:46.4788921Z * [new branch] gh/swolchok/793/base -> origin/gh/swolchok/793/base 2025-09-07T07:38:46.4789319Z * [new branch] gh/swolchok/793/head -> origin/gh/swolchok/793/head 2025-09-07T07:38:46.4789929Z * [new branch] gh/swolchok/793/orig -> origin/gh/swolchok/793/orig 2025-09-07T07:38:46.4790690Z * [new branch] gh/swolchok/794/base -> origin/gh/swolchok/794/base 2025-09-07T07:38:46.4791088Z * [new branch] gh/swolchok/794/head -> origin/gh/swolchok/794/head 2025-09-07T07:38:46.4791649Z * [new branch] gh/swolchok/794/orig -> origin/gh/swolchok/794/orig 2025-09-07T07:38:46.4792675Z * [new branch] gh/swolchok/795/base -> origin/gh/swolchok/795/base 2025-09-07T07:38:46.4793246Z * [new branch] gh/swolchok/795/head -> origin/gh/swolchok/795/head 2025-09-07T07:38:46.4793680Z * [new branch] gh/swolchok/795/orig -> origin/gh/swolchok/795/orig 2025-09-07T07:38:46.4794522Z * [new branch] gh/swolchok/796/base -> origin/gh/swolchok/796/base 2025-09-07T07:38:46.4795104Z * [new branch] gh/swolchok/796/head -> origin/gh/swolchok/796/head 2025-09-07T07:38:46.4795541Z * [new branch] gh/swolchok/796/orig -> origin/gh/swolchok/796/orig 2025-09-07T07:38:46.4796526Z * [new branch] gh/swolchok/797/base -> origin/gh/swolchok/797/base 2025-09-07T07:38:46.4796977Z * [new branch] gh/swolchok/797/head -> origin/gh/swolchok/797/head 2025-09-07T07:38:46.4797679Z * [new branch] gh/swolchok/797/orig -> origin/gh/swolchok/797/orig 2025-09-07T07:38:46.4798493Z * [new branch] gh/swolchok/798/base -> origin/gh/swolchok/798/base 2025-09-07T07:38:46.4798864Z * [new branch] gh/swolchok/798/head -> origin/gh/swolchok/798/head 2025-09-07T07:38:46.4799530Z * [new branch] gh/swolchok/798/orig -> origin/gh/swolchok/798/orig 2025-09-07T07:38:46.4800399Z * [new branch] gh/swolchok/799/base -> origin/gh/swolchok/799/base 2025-09-07T07:38:46.4800871Z * [new branch] gh/swolchok/799/head -> origin/gh/swolchok/799/head 2025-09-07T07:38:46.4801551Z * [new branch] gh/swolchok/799/orig -> origin/gh/swolchok/799/orig 2025-09-07T07:38:46.4802362Z * [new branch] gh/swolchok/800/base -> origin/gh/swolchok/800/base 2025-09-07T07:38:46.4802734Z * [new branch] gh/swolchok/800/head -> origin/gh/swolchok/800/head 2025-09-07T07:38:46.4803411Z * [new branch] gh/swolchok/800/orig -> origin/gh/swolchok/800/orig 2025-09-07T07:38:46.4804214Z * [new branch] gh/swolchok/801/base -> origin/gh/swolchok/801/base 2025-09-07T07:38:46.4804605Z * [new branch] gh/swolchok/801/head -> origin/gh/swolchok/801/head 2025-09-07T07:38:46.4805275Z * [new branch] gh/swolchok/801/orig -> origin/gh/swolchok/801/orig 2025-09-07T07:38:46.4806035Z * [new branch] gh/swolchok/802/base -> origin/gh/swolchok/802/base 2025-09-07T07:38:46.4806435Z * [new branch] gh/swolchok/802/head -> origin/gh/swolchok/802/head 2025-09-07T07:38:46.4807135Z * [new branch] gh/swolchok/802/orig -> origin/gh/swolchok/802/orig 2025-09-07T07:38:46.4807875Z * [new branch] gh/swolchok/803/base -> origin/gh/swolchok/803/base 2025-09-07T07:38:46.4808345Z * [new branch] gh/swolchok/803/head -> origin/gh/swolchok/803/head 2025-09-07T07:38:46.4808979Z * [new branch] gh/swolchok/803/orig -> origin/gh/swolchok/803/orig 2025-09-07T07:38:46.4809844Z * [new branch] gh/swolchok/804/base -> origin/gh/swolchok/804/base 2025-09-07T07:38:46.4810270Z * [new branch] gh/swolchok/804/head -> origin/gh/swolchok/804/head 
2025-09-07T07:38:46.4810841Z * [new branch] gh/swolchok/804/orig -> origin/gh/swolchok/804/orig 2025-09-07T07:38:46.4811584Z * [new branch] gh/swolchok/805/base -> origin/gh/swolchok/805/base 2025-09-07T07:38:46.4812032Z * [new branch] gh/swolchok/805/head -> origin/gh/swolchok/805/head 2025-09-07T07:38:46.4812612Z * [new branch] gh/swolchok/805/orig -> origin/gh/swolchok/805/orig 2025-09-07T07:38:46.4813249Z * [new branch] gh/swolchok/806/base -> origin/gh/swolchok/806/base 2025-09-07T07:38:46.4813699Z * [new branch] gh/swolchok/806/head -> origin/gh/swolchok/806/head 2025-09-07T07:38:46.4814270Z * [new branch] gh/swolchok/806/orig -> origin/gh/swolchok/806/orig 2025-09-07T07:38:46.4815380Z * [new branch] gh/swolchok/807/base -> origin/gh/swolchok/807/base 2025-09-07T07:38:46.4816100Z * [new branch] gh/swolchok/807/head -> origin/gh/swolchok/807/head 2025-09-07T07:38:46.4816745Z * [new branch] gh/swolchok/807/orig -> origin/gh/swolchok/807/orig 2025-09-07T07:38:46.4817600Z * [new branch] gh/swolchok/808/base -> origin/gh/swolchok/808/base 2025-09-07T07:38:46.4818053Z * [new branch] gh/swolchok/808/head -> origin/gh/swolchok/808/head 2025-09-07T07:38:46.4818626Z * [new branch] gh/swolchok/808/orig -> origin/gh/swolchok/808/orig 2025-09-07T07:38:46.4819370Z * [new branch] gh/swolchok/809/base -> origin/gh/swolchok/809/base 2025-09-07T07:38:46.4819834Z * [new branch] gh/swolchok/809/head -> origin/gh/swolchok/809/head 2025-09-07T07:38:46.4820456Z * [new branch] gh/swolchok/809/orig -> origin/gh/swolchok/809/orig 2025-09-07T07:38:46.4821385Z * [new branch] gh/swolchok/810/base -> origin/gh/swolchok/810/base 2025-09-07T07:38:46.4821768Z * [new branch] gh/swolchok/810/head -> origin/gh/swolchok/810/head 2025-09-07T07:38:46.4822248Z * [new branch] gh/swolchok/810/orig -> origin/gh/swolchok/810/orig 2025-09-07T07:38:46.4823124Z * [new branch] gh/swolchok/811/base -> origin/gh/swolchok/811/base 2025-09-07T07:38:46.4823736Z * [new branch] gh/swolchok/811/head -> origin/gh/swolchok/811/head 2025-09-07T07:38:46.4824195Z * [new branch] gh/swolchok/811/orig -> origin/gh/swolchok/811/orig 2025-09-07T07:38:46.4825091Z * [new branch] gh/swolchok/812/base -> origin/gh/swolchok/812/base 2025-09-07T07:38:46.4825734Z * [new branch] gh/swolchok/812/head -> origin/gh/swolchok/812/head 2025-09-07T07:38:46.4826097Z * [new branch] gh/swolchok/812/orig -> origin/gh/swolchok/812/orig 2025-09-07T07:38:46.4827011Z * [new branch] gh/swolchok/813/base -> origin/gh/swolchok/813/base 2025-09-07T07:38:46.4827375Z * [new branch] gh/swolchok/813/head -> origin/gh/swolchok/813/head 2025-09-07T07:38:46.4827988Z * [new branch] gh/swolchok/813/orig -> origin/gh/swolchok/813/orig 2025-09-07T07:38:46.4828735Z * [new branch] gh/swolchok/814/base -> origin/gh/swolchok/814/base 2025-09-07T07:38:46.4829130Z * [new branch] gh/swolchok/814/head -> origin/gh/swolchok/814/head 2025-09-07T07:38:46.4829753Z * [new branch] gh/swolchok/814/orig -> origin/gh/swolchok/814/orig 2025-09-07T07:38:46.4830504Z * [new branch] gh/swolchok/815/base -> origin/gh/swolchok/815/base 2025-09-07T07:38:46.4830960Z * [new branch] gh/swolchok/815/head -> origin/gh/swolchok/815/head 2025-09-07T07:38:46.4831584Z * [new branch] gh/swolchok/815/orig -> origin/gh/swolchok/815/orig 2025-09-07T07:38:46.4832301Z * [new branch] gh/swolchok/816/base -> origin/gh/swolchok/816/base 2025-09-07T07:38:46.4832880Z * [new branch] gh/swolchok/816/head -> origin/gh/swolchok/816/head 2025-09-07T07:38:46.4833287Z * [new branch] gh/swolchok/816/orig -> origin/gh/swolchok/816/orig 
2025-09-07T07:38:46.4834515Z * [new branch] gh/swolchok/817/base -> origin/gh/swolchok/817/base 2025-09-07T07:38:46.4835105Z * [new branch] gh/swolchok/817/head -> origin/gh/swolchok/817/head 2025-09-07T07:38:46.4835496Z * [new branch] gh/swolchok/817/orig -> origin/gh/swolchok/817/orig 2025-09-07T07:38:46.4836371Z * [new branch] gh/swolchok/818/base -> origin/gh/swolchok/818/base 2025-09-07T07:38:46.4836721Z * [new branch] gh/swolchok/818/head -> origin/gh/swolchok/818/head 2025-09-07T07:38:46.4837294Z * [new branch] gh/swolchok/818/orig -> origin/gh/swolchok/818/orig 2025-09-07T07:38:46.4838192Z * [new branch] gh/swolchok/819/base -> origin/gh/swolchok/819/base 2025-09-07T07:38:46.4838650Z * [new branch] gh/swolchok/819/head -> origin/gh/swolchok/819/head 2025-09-07T07:38:46.4839258Z * [new branch] gh/swolchok/819/orig -> origin/gh/swolchok/819/orig 2025-09-07T07:38:46.4839974Z * [new branch] gh/swolchok/820/base -> origin/gh/swolchok/820/base 2025-09-07T07:38:46.4840368Z * [new branch] gh/swolchok/820/head -> origin/gh/swolchok/820/head 2025-09-07T07:38:46.4840829Z * [new branch] gh/swolchok/820/orig -> origin/gh/swolchok/820/orig 2025-09-07T07:38:46.4841692Z * [new branch] gh/swolchok/821/base -> origin/gh/swolchok/821/base 2025-09-07T07:38:46.4842059Z * [new branch] gh/swolchok/821/head -> origin/gh/swolchok/821/head 2025-09-07T07:38:46.4842623Z * [new branch] gh/swolchok/821/orig -> origin/gh/swolchok/821/orig 2025-09-07T07:38:46.4843473Z * [new branch] gh/swolchok/822/base -> origin/gh/swolchok/822/base 2025-09-07T07:38:46.4844043Z * [new branch] gh/swolchok/822/head -> origin/gh/swolchok/822/head 2025-09-07T07:38:46.4844476Z * [new branch] gh/swolchok/822/orig -> origin/gh/swolchok/822/orig 2025-09-07T07:38:46.4845377Z * [new branch] gh/swolchok/823/base -> origin/gh/swolchok/823/base 2025-09-07T07:38:46.4845839Z * [new branch] gh/swolchok/823/head -> origin/gh/swolchok/823/head 2025-09-07T07:38:46.4846426Z * [new branch] gh/swolchok/823/orig -> origin/gh/swolchok/823/orig 2025-09-07T07:38:46.4847074Z * [new branch] gh/swolchok/824/base -> origin/gh/swolchok/824/base 2025-09-07T07:38:46.4847481Z * [new branch] gh/swolchok/824/head -> origin/gh/swolchok/824/head 2025-09-07T07:38:46.4848126Z * [new branch] gh/swolchok/824/orig -> origin/gh/swolchok/824/orig 2025-09-07T07:38:46.4849132Z * [new branch] gh/swolchok/825/base -> origin/gh/swolchok/825/base 2025-09-07T07:38:46.4849698Z * [new branch] gh/swolchok/825/head -> origin/gh/swolchok/825/head 2025-09-07T07:38:46.4850145Z * [new branch] gh/swolchok/825/orig -> origin/gh/swolchok/825/orig 2025-09-07T07:38:46.4851005Z * [new branch] gh/swolchok/826/base -> origin/gh/swolchok/826/base 2025-09-07T07:38:46.4851389Z * [new branch] gh/swolchok/826/head -> origin/gh/swolchok/826/head 2025-09-07T07:38:46.4851829Z * [new branch] gh/swolchok/826/orig -> origin/gh/swolchok/826/orig 2025-09-07T07:38:46.4852723Z * [new branch] gh/swolchok/827/base -> origin/gh/swolchok/827/base 2025-09-07T07:38:46.4853539Z * [new branch] gh/swolchok/827/head -> origin/gh/swolchok/827/head 2025-09-07T07:38:46.4853764Z * [new branch] gh/swolchok/827/orig -> origin/gh/swolchok/827/orig 2025-09-07T07:38:46.4854676Z * [new branch] gh/swolchok/828/base -> origin/gh/swolchok/828/base 2025-09-07T07:38:46.4855039Z * [new branch] gh/swolchok/828/head -> origin/gh/swolchok/828/head 2025-09-07T07:38:46.4855609Z * [new branch] gh/swolchok/828/orig -> origin/gh/swolchok/828/orig 2025-09-07T07:38:46.4856247Z * [new branch] gh/swolchok/829/base -> origin/gh/swolchok/829/base 
2025-09-07T07:38:46.4856659Z * [new branch] gh/swolchok/829/head -> origin/gh/swolchok/829/head 2025-09-07T07:38:46.4857243Z * [new branch] gh/swolchok/829/orig -> origin/gh/swolchok/829/orig 2025-09-07T07:38:46.4858051Z * [new branch] gh/swolchok/830/base -> origin/gh/swolchok/830/base 2025-09-07T07:38:46.4858474Z * [new branch] gh/swolchok/830/head -> origin/gh/swolchok/830/head 2025-09-07T07:38:46.4858890Z * [new branch] gh/swolchok/830/orig -> origin/gh/swolchok/830/orig 2025-09-07T07:38:46.4859648Z * [new branch] gh/swolchok/831/base -> origin/gh/swolchok/831/base 2025-09-07T07:38:46.4860254Z * [new branch] gh/swolchok/831/head -> origin/gh/swolchok/831/head 2025-09-07T07:38:46.4860736Z * [new branch] gh/swolchok/831/orig -> origin/gh/swolchok/831/orig 2025-09-07T07:38:46.4861467Z * [new branch] gh/swolchok/832/base -> origin/gh/swolchok/832/base 2025-09-07T07:38:46.4861987Z * [new branch] gh/swolchok/832/head -> origin/gh/swolchok/832/head 2025-09-07T07:38:46.4862392Z * [new branch] gh/swolchok/832/orig -> origin/gh/swolchok/832/orig 2025-09-07T07:38:46.4863364Z * [new branch] gh/syed-ahmed/3/base -> origin/gh/syed-ahmed/3/base 2025-09-07T07:38:46.4863816Z * [new branch] gh/syed-ahmed/3/head -> origin/gh/syed-ahmed/3/head 2025-09-07T07:38:46.4864378Z * [new branch] gh/syed-ahmed/3/orig -> origin/gh/syed-ahmed/3/orig 2025-09-07T07:38:46.4865069Z * [new branch] gh/syed-ahmed/4/base -> origin/gh/syed-ahmed/4/base 2025-09-07T07:38:46.4865478Z * [new branch] gh/syed-ahmed/4/head -> origin/gh/syed-ahmed/4/head 2025-09-07T07:38:46.4866357Z * [new branch] gh/syed-ahmed/4/orig -> origin/gh/syed-ahmed/4/orig 2025-09-07T07:38:46.4867119Z * [new branch] gh/syed-ahmed/5/base -> origin/gh/syed-ahmed/5/base 2025-09-07T07:38:46.4867536Z * [new branch] gh/syed-ahmed/5/head -> origin/gh/syed-ahmed/5/head 2025-09-07T07:38:46.4868111Z * [new branch] gh/syed-ahmed/5/orig -> origin/gh/syed-ahmed/5/orig 2025-09-07T07:38:46.4869083Z * [new branch] gh/teja-rao/4/base -> origin/gh/teja-rao/4/base 2025-09-07T07:38:46.4869690Z * [new branch] gh/teja-rao/4/head -> origin/gh/teja-rao/4/head 2025-09-07T07:38:46.4870068Z * [new branch] gh/teja-rao/4/orig -> origin/gh/teja-rao/4/orig 2025-09-07T07:38:46.4871163Z * [new branch] gh/tianyu-l/2/base -> origin/gh/tianyu-l/2/base 2025-09-07T07:38:46.4871613Z * [new branch] gh/tianyu-l/2/head -> origin/gh/tianyu-l/2/head 2025-09-07T07:38:46.4872187Z * [new branch] gh/tianyu-l/2/orig -> origin/gh/tianyu-l/2/orig 2025-09-07T07:38:46.4872921Z * [new branch] gh/tianyu-l/3/base -> origin/gh/tianyu-l/3/base 2025-09-07T07:38:46.4873364Z * [new branch] gh/tianyu-l/3/head -> origin/gh/tianyu-l/3/head 2025-09-07T07:38:46.4873993Z * [new branch] gh/tianyu-l/3/orig -> origin/gh/tianyu-l/3/orig 2025-09-07T07:38:46.4874681Z * [new branch] gh/tianyu-l/4/base -> origin/gh/tianyu-l/4/base 2025-09-07T07:38:46.4875112Z * [new branch] gh/tianyu-l/4/head -> origin/gh/tianyu-l/4/head 2025-09-07T07:38:46.4875576Z * [new branch] gh/tianyu-l/4/orig -> origin/gh/tianyu-l/4/orig 2025-09-07T07:38:46.4876706Z * [new branch] gh/tugsbayasgalan/1/base -> origin/gh/tugsbayasgalan/1/base 2025-09-07T07:38:46.4877089Z * [new branch] gh/tugsbayasgalan/1/head -> origin/gh/tugsbayasgalan/1/head 2025-09-07T07:38:46.4877773Z * [new branch] gh/tugsbayasgalan/1/orig -> origin/gh/tugsbayasgalan/1/orig 2025-09-07T07:38:46.4878709Z * [new branch] gh/tugsbayasgalan/10/base -> origin/gh/tugsbayasgalan/10/base 2025-09-07T07:38:46.4879155Z * [new branch] gh/tugsbayasgalan/10/head -> origin/gh/tugsbayasgalan/10/head 
2025-09-07T07:38:46.4879680Z * [new branch] gh/tugsbayasgalan/10/orig -> origin/gh/tugsbayasgalan/10/orig 2025-09-07T07:38:46.4880500Z * [new branch] gh/tugsbayasgalan/11/base -> origin/gh/tugsbayasgalan/11/base 2025-09-07T07:38:46.4881170Z * [new branch] gh/tugsbayasgalan/11/head -> origin/gh/tugsbayasgalan/11/head 2025-09-07T07:38:46.4881578Z * [new branch] gh/tugsbayasgalan/11/orig -> origin/gh/tugsbayasgalan/11/orig 2025-09-07T07:38:46.4882415Z * [new branch] gh/tugsbayasgalan/12/base -> origin/gh/tugsbayasgalan/12/base 2025-09-07T07:38:46.4882803Z * [new branch] gh/tugsbayasgalan/12/head -> origin/gh/tugsbayasgalan/12/head 2025-09-07T07:38:46.4883403Z * [new branch] gh/tugsbayasgalan/12/orig -> origin/gh/tugsbayasgalan/12/orig 2025-09-07T07:38:46.4884138Z * [new branch] gh/tugsbayasgalan/13/base -> origin/gh/tugsbayasgalan/13/base 2025-09-07T07:38:46.4884552Z * [new branch] gh/tugsbayasgalan/13/head -> origin/gh/tugsbayasgalan/13/head 2025-09-07T07:38:46.4885143Z * [new branch] gh/tugsbayasgalan/13/orig -> origin/gh/tugsbayasgalan/13/orig 2025-09-07T07:38:46.4885962Z * [new branch] gh/tugsbayasgalan/14/base -> origin/gh/tugsbayasgalan/14/base 2025-09-07T07:38:46.4886359Z * [new branch] gh/tugsbayasgalan/14/head -> origin/gh/tugsbayasgalan/14/head 2025-09-07T07:38:46.4886828Z * [new branch] gh/tugsbayasgalan/14/orig -> origin/gh/tugsbayasgalan/14/orig 2025-09-07T07:38:46.4887772Z * [new branch] gh/tugsbayasgalan/15/base -> origin/gh/tugsbayasgalan/15/base 2025-09-07T07:38:46.4888258Z * [new branch] gh/tugsbayasgalan/15/head -> origin/gh/tugsbayasgalan/15/head 2025-09-07T07:38:46.4888729Z * [new branch] gh/tugsbayasgalan/15/orig -> origin/gh/tugsbayasgalan/15/orig 2025-09-07T07:38:46.4889746Z * [new branch] gh/tugsbayasgalan/2/base -> origin/gh/tugsbayasgalan/2/base 2025-09-07T07:38:46.4890154Z * [new branch] gh/tugsbayasgalan/2/head -> origin/gh/tugsbayasgalan/2/head 2025-09-07T07:38:46.4890753Z * [new branch] gh/tugsbayasgalan/2/orig -> origin/gh/tugsbayasgalan/2/orig 2025-09-07T07:38:46.4891380Z * [new branch] gh/tugsbayasgalan/3/base -> origin/gh/tugsbayasgalan/3/base 2025-09-07T07:38:46.4892332Z * [new branch] gh/tugsbayasgalan/3/head -> origin/gh/tugsbayasgalan/3/head 2025-09-07T07:38:46.4892781Z * [new branch] gh/tugsbayasgalan/3/orig -> origin/gh/tugsbayasgalan/3/orig 2025-09-07T07:38:46.4893590Z * [new branch] gh/tugsbayasgalan/4/base -> origin/gh/tugsbayasgalan/4/base 2025-09-07T07:38:46.4894250Z * [new branch] gh/tugsbayasgalan/4/head -> origin/gh/tugsbayasgalan/4/head 2025-09-07T07:38:46.4894680Z * [new branch] gh/tugsbayasgalan/4/orig -> origin/gh/tugsbayasgalan/4/orig 2025-09-07T07:38:46.4895565Z * [new branch] gh/tugsbayasgalan/5/base -> origin/gh/tugsbayasgalan/5/base 2025-09-07T07:38:46.4896251Z * [new branch] gh/tugsbayasgalan/5/head -> origin/gh/tugsbayasgalan/5/head 2025-09-07T07:38:46.4896642Z * [new branch] gh/tugsbayasgalan/5/orig -> origin/gh/tugsbayasgalan/5/orig 2025-09-07T07:38:46.4897375Z * [new branch] gh/tugsbayasgalan/6/base -> origin/gh/tugsbayasgalan/6/base 2025-09-07T07:38:46.4897785Z * [new branch] gh/tugsbayasgalan/6/head -> origin/gh/tugsbayasgalan/6/head 2025-09-07T07:38:46.4898386Z * [new branch] gh/tugsbayasgalan/6/orig -> origin/gh/tugsbayasgalan/6/orig 2025-09-07T07:38:46.4899276Z * [new branch] gh/tugsbayasgalan/7/base -> origin/gh/tugsbayasgalan/7/base 2025-09-07T07:38:46.4899683Z * [new branch] gh/tugsbayasgalan/7/head -> origin/gh/tugsbayasgalan/7/head 2025-09-07T07:38:46.4900292Z * [new branch] gh/tugsbayasgalan/7/orig -> 
origin/gh/tugsbayasgalan/7/orig 2025-09-07T07:38:46.4901012Z * [new branch] gh/tugsbayasgalan/8/base -> origin/gh/tugsbayasgalan/8/base 2025-09-07T07:38:46.4901818Z * [new branch] gh/tugsbayasgalan/8/head -> origin/gh/tugsbayasgalan/8/head 2025-09-07T07:38:46.4902228Z * [new branch] gh/tugsbayasgalan/8/orig -> origin/gh/tugsbayasgalan/8/orig 2025-09-07T07:38:46.4903007Z * [new branch] gh/tugsbayasgalan/9/base -> origin/gh/tugsbayasgalan/9/base 2025-09-07T07:38:46.4903415Z * [new branch] gh/tugsbayasgalan/9/head -> origin/gh/tugsbayasgalan/9/head 2025-09-07T07:38:46.4904017Z * [new branch] gh/tugsbayasgalan/9/orig -> origin/gh/tugsbayasgalan/9/orig 2025-09-07T07:38:46.4904970Z * [new branch] gh/v0i0/1/base -> origin/gh/v0i0/1/base 2025-09-07T07:38:46.4905390Z * [new branch] gh/v0i0/1/head -> origin/gh/v0i0/1/head 2025-09-07T07:38:46.4905912Z * [new branch] gh/v0i0/1/orig -> origin/gh/v0i0/1/orig 2025-09-07T07:38:46.4906710Z * [new branch] gh/v0i0/4/base -> origin/gh/v0i0/4/base 2025-09-07T07:38:46.4907138Z * [new branch] gh/v0i0/4/head -> origin/gh/v0i0/4/head 2025-09-07T07:38:46.4907859Z * [new branch] gh/v0i0/4/orig -> origin/gh/v0i0/4/orig 2025-09-07T07:38:46.4908592Z * [new branch] gh/v0i0/6/base -> origin/gh/v0i0/6/base 2025-09-07T07:38:46.4909013Z * [new branch] gh/v0i0/6/head -> origin/gh/v0i0/6/head 2025-09-07T07:38:46.4909611Z * [new branch] gh/v0i0/6/orig -> origin/gh/v0i0/6/orig 2025-09-07T07:38:46.4910448Z * [new branch] gh/v0i0/7/base -> origin/gh/v0i0/7/base 2025-09-07T07:38:46.4910962Z * [new branch] gh/v0i0/7/head -> origin/gh/v0i0/7/head 2025-09-07T07:38:46.4911372Z * [new branch] gh/v0i0/7/orig -> origin/gh/v0i0/7/orig 2025-09-07T07:38:46.4912128Z * [new branch] gh/v0i0/8/base -> origin/gh/v0i0/8/base 2025-09-07T07:38:46.4912510Z * [new branch] gh/v0i0/8/head -> origin/gh/v0i0/8/head 2025-09-07T07:38:46.4913935Z * [new branch] gh/v0i0/8/orig -> origin/gh/v0i0/8/orig 2025-09-07T07:38:46.4914079Z * [new branch] gh/v0i0/9/base -> origin/gh/v0i0/9/base 2025-09-07T07:38:46.4914241Z * [new branch] gh/v0i0/9/head -> origin/gh/v0i0/9/head 2025-09-07T07:38:46.4914837Z * [new branch] gh/v0i0/9/orig -> origin/gh/v0i0/9/orig 2025-09-07T07:38:46.4915680Z * [new branch] gh/vkuzo/1/next -> origin/gh/vkuzo/1/next 2025-09-07T07:38:46.4916367Z * [new branch] gh/vkuzo/2/next -> origin/gh/vkuzo/2/next 2025-09-07T07:38:46.4917221Z * [new branch] gh/vkuzo/3/next -> origin/gh/vkuzo/3/next 2025-09-07T07:38:46.4918071Z * [new branch] gh/vkuzo/4/base -> origin/gh/vkuzo/4/base 2025-09-07T07:38:46.4918781Z * [new branch] gh/vkuzo/4/head -> origin/gh/vkuzo/4/head 2025-09-07T07:38:46.4919193Z * [new branch] gh/vkuzo/4/orig -> origin/gh/vkuzo/4/orig 2025-09-07T07:38:46.4920092Z * [new branch] gh/vkuzo/5/base -> origin/gh/vkuzo/5/base 2025-09-07T07:38:46.4920709Z * [new branch] gh/vkuzo/5/head -> origin/gh/vkuzo/5/head 2025-09-07T07:38:46.4921150Z * [new branch] gh/vkuzo/5/orig -> origin/gh/vkuzo/5/orig 2025-09-07T07:38:46.4922119Z * [new branch] gh/vkuzo/6/base -> origin/gh/vkuzo/6/base 2025-09-07T07:38:46.4922486Z * [new branch] gh/vkuzo/6/head -> origin/gh/vkuzo/6/head 2025-09-07T07:38:46.4923089Z * [new branch] gh/vkuzo/6/orig -> origin/gh/vkuzo/6/orig 2025-09-07T07:38:46.4923709Z * [new branch] gh/vkuzo/7/base -> origin/gh/vkuzo/7/base 2025-09-07T07:38:46.4924321Z * [new branch] gh/vkuzo/7/head -> origin/gh/vkuzo/7/head 2025-09-07T07:38:46.4924891Z * [new branch] gh/vkuzo/7/orig -> origin/gh/vkuzo/7/orig 2025-09-07T07:38:46.4925901Z * [new branch] gh/wconstab/419/base -> origin/gh/wconstab/419/base 
2025-09-07T07:38:46.4926291Z * [new branch] gh/wconstab/419/head -> origin/gh/wconstab/419/head 2025-09-07T07:38:46.4926992Z * [new branch] gh/wconstab/419/orig -> origin/gh/wconstab/419/orig 2025-09-07T07:38:46.4927788Z * [new branch] gh/wconstab/424/base -> origin/gh/wconstab/424/base 2025-09-07T07:38:46.4928160Z * [new branch] gh/wconstab/424/head -> origin/gh/wconstab/424/head 2025-09-07T07:38:46.4928737Z * [new branch] gh/wconstab/424/orig -> origin/gh/wconstab/424/orig 2025-09-07T07:38:46.4929462Z * [new branch] gh/wconstab/435/base -> origin/gh/wconstab/435/base 2025-09-07T07:38:46.4930060Z * [new branch] gh/wconstab/435/head -> origin/gh/wconstab/435/head 2025-09-07T07:38:46.4930424Z * [new branch] gh/wconstab/435/orig -> origin/gh/wconstab/435/orig 2025-09-07T07:38:46.4931290Z * [new branch] gh/wconstab/438/base -> origin/gh/wconstab/438/base 2025-09-07T07:38:46.4931722Z * [new branch] gh/wconstab/438/head -> origin/gh/wconstab/438/head 2025-09-07T07:38:46.4932239Z * [new branch] gh/wconstab/438/orig -> origin/gh/wconstab/438/orig 2025-09-07T07:38:46.4933043Z * [new branch] gh/wconstab/440/base -> origin/gh/wconstab/440/base 2025-09-07T07:38:46.4933657Z * [new branch] gh/wconstab/440/head -> origin/gh/wconstab/440/head 2025-09-07T07:38:46.4934238Z * [new branch] gh/wconstab/440/orig -> origin/gh/wconstab/440/orig 2025-09-07T07:38:46.4935063Z * [new branch] gh/wconstab/441/base -> origin/gh/wconstab/441/base 2025-09-07T07:38:46.4935463Z * [new branch] gh/wconstab/441/head -> origin/gh/wconstab/441/head 2025-09-07T07:38:46.4936236Z * [new branch] gh/wconstab/441/orig -> origin/gh/wconstab/441/orig 2025-09-07T07:38:46.4937101Z * [new branch] gh/wconstab/442/base -> origin/gh/wconstab/442/base 2025-09-07T07:38:46.4937703Z * [new branch] gh/wconstab/442/head -> origin/gh/wconstab/442/head 2025-09-07T07:38:46.4938157Z * [new branch] gh/wconstab/442/orig -> origin/gh/wconstab/442/orig 2025-09-07T07:38:46.4939332Z * [new branch] gh/wconstab/443/base -> origin/gh/wconstab/443/base 2025-09-07T07:38:46.4939742Z * [new branch] gh/wconstab/443/head -> origin/gh/wconstab/443/head 2025-09-07T07:38:46.4940357Z * [new branch] gh/wconstab/443/orig -> origin/gh/wconstab/443/orig 2025-09-07T07:38:46.4941298Z * [new branch] gh/wconstab/444/base -> origin/gh/wconstab/444/base 2025-09-07T07:38:46.4941566Z * [new branch] gh/wconstab/444/head -> origin/gh/wconstab/444/head 2025-09-07T07:38:46.4942232Z * [new branch] gh/wconstab/444/orig -> origin/gh/wconstab/444/orig 2025-09-07T07:38:46.4942929Z * [new branch] gh/wconstab/445/base -> origin/gh/wconstab/445/base 2025-09-07T07:38:46.4943362Z * [new branch] gh/wconstab/445/head -> origin/gh/wconstab/445/head 2025-09-07T07:38:46.4943947Z * [new branch] gh/wconstab/445/orig -> origin/gh/wconstab/445/orig 2025-09-07T07:38:46.4945114Z * [new branch] gh/wconstab/446/base -> origin/gh/wconstab/446/base 2025-09-07T07:38:46.4945761Z * [new branch] gh/wconstab/446/head -> origin/gh/wconstab/446/head 2025-09-07T07:38:46.4946698Z * [new branch] gh/wconstab/446/orig -> origin/gh/wconstab/446/orig 2025-09-07T07:38:46.4947509Z * [new branch] gh/wconstab/447/base -> origin/gh/wconstab/447/base 2025-09-07T07:38:46.4947966Z * [new branch] gh/wconstab/447/head -> origin/gh/wconstab/447/head 2025-09-07T07:38:46.4948554Z * [new branch] gh/wconstab/447/orig -> origin/gh/wconstab/447/orig 2025-09-07T07:38:46.4949546Z * [new branch] gh/weifengpy/27/base -> origin/gh/weifengpy/27/base 2025-09-07T07:38:46.4949973Z * [new branch] gh/weifengpy/27/head -> origin/gh/weifengpy/27/head 
2025-09-07T07:38:46.4950562Z * [new branch] gh/weifengpy/27/orig -> origin/gh/weifengpy/27/orig 2025-09-07T07:38:46.4951317Z * [new branch] gh/weifengpy/30/base -> origin/gh/weifengpy/30/base 2025-09-07T07:38:46.4951710Z * [new branch] gh/weifengpy/30/head -> origin/gh/weifengpy/30/head 2025-09-07T07:38:46.4952222Z * [new branch] gh/weifengpy/30/orig -> origin/gh/weifengpy/30/orig 2025-09-07T07:38:46.4953319Z * [new branch] gh/williamwen42/196/base -> origin/gh/williamwen42/196/base 2025-09-07T07:38:46.4953792Z * [new branch] gh/williamwen42/196/head -> origin/gh/williamwen42/196/head 2025-09-07T07:38:46.4954513Z * [new branch] gh/williamwen42/196/orig -> origin/gh/williamwen42/196/orig 2025-09-07T07:38:46.4955288Z * [new branch] gh/williamwen42/250/base -> origin/gh/williamwen42/250/base 2025-09-07T07:38:46.4955918Z * [new branch] gh/williamwen42/250/head -> origin/gh/williamwen42/250/head 2025-09-07T07:38:46.4956376Z * [new branch] gh/williamwen42/250/orig -> origin/gh/williamwen42/250/orig 2025-09-07T07:38:46.4957224Z * [new branch] gh/williamwen42/258/base -> origin/gh/williamwen42/258/base 2025-09-07T07:38:46.4957829Z * [new branch] gh/williamwen42/258/head -> origin/gh/williamwen42/258/head 2025-09-07T07:38:46.4958219Z * [new branch] gh/williamwen42/258/orig -> origin/gh/williamwen42/258/orig 2025-09-07T07:38:46.4959065Z * [new branch] gh/williamwen42/266/base -> origin/gh/williamwen42/266/base 2025-09-07T07:38:46.4959502Z * [new branch] gh/williamwen42/266/head -> origin/gh/williamwen42/266/head 2025-09-07T07:38:46.4960100Z * [new branch] gh/williamwen42/266/orig -> origin/gh/williamwen42/266/orig 2025-09-07T07:38:46.4960867Z * [new branch] gh/williamwen42/267/base -> origin/gh/williamwen42/267/base 2025-09-07T07:38:46.4961792Z * [new branch] gh/williamwen42/267/head -> origin/gh/williamwen42/267/head 2025-09-07T07:38:46.4962436Z * [new branch] gh/williamwen42/267/orig -> origin/gh/williamwen42/267/orig 2025-09-07T07:38:46.4963140Z * [new branch] gh/williamwen42/270/base -> origin/gh/williamwen42/270/base 2025-09-07T07:38:46.4963601Z * [new branch] gh/williamwen42/270/head -> origin/gh/williamwen42/270/head 2025-09-07T07:38:46.4964242Z * [new branch] gh/williamwen42/270/orig -> origin/gh/williamwen42/270/orig 2025-09-07T07:38:46.4964927Z * [new branch] gh/williamwen42/271/base -> origin/gh/williamwen42/271/base 2025-09-07T07:38:46.4965639Z * [new branch] gh/williamwen42/271/head -> origin/gh/williamwen42/271/head 2025-09-07T07:38:46.4966050Z * [new branch] gh/williamwen42/271/orig -> origin/gh/williamwen42/271/orig 2025-09-07T07:38:46.4966858Z * [new branch] gh/williamwen42/272/base -> origin/gh/williamwen42/272/base 2025-09-07T07:38:46.4967272Z * [new branch] gh/williamwen42/272/head -> origin/gh/williamwen42/272/head 2025-09-07T07:38:46.4967942Z * [new branch] gh/williamwen42/272/orig -> origin/gh/williamwen42/272/orig 2025-09-07T07:38:46.4968626Z * [new branch] gh/williamwen42/274/base -> origin/gh/williamwen42/274/base 2025-09-07T07:38:46.4969088Z * [new branch] gh/williamwen42/274/head -> origin/gh/williamwen42/274/head 2025-09-07T07:38:46.4969743Z * [new branch] gh/williamwen42/274/orig -> origin/gh/williamwen42/274/orig 2025-09-07T07:38:46.4970466Z * [new branch] gh/williamwen42/275/base -> origin/gh/williamwen42/275/base 2025-09-07T07:38:46.4970880Z * [new branch] gh/williamwen42/275/head -> origin/gh/williamwen42/275/head 2025-09-07T07:38:46.4971630Z * [new branch] gh/williamwen42/276/base -> origin/gh/williamwen42/276/base 2025-09-07T07:38:46.4972046Z * [new branch] 
gh/williamwen42/276/head -> origin/gh/williamwen42/276/head 2025-09-07T07:38:46.4972638Z * [new branch] gh/williamwen42/276/orig -> origin/gh/williamwen42/276/orig 2025-09-07T07:38:46.4973509Z * [new branch] gh/williamwen42/277/base -> origin/gh/williamwen42/277/base 2025-09-07T07:38:46.4973915Z * [new branch] gh/williamwen42/277/head -> origin/gh/williamwen42/277/head 2025-09-07T07:38:46.4974617Z * [new branch] gh/williamwen42/277/orig -> origin/gh/williamwen42/277/orig 2025-09-07T07:38:46.4975370Z * [new branch] gh/williamwen42/278/base -> origin/gh/williamwen42/278/base 2025-09-07T07:38:46.4975962Z * [new branch] gh/williamwen42/278/head -> origin/gh/williamwen42/278/head 2025-09-07T07:38:46.4976341Z * [new branch] gh/williamwen42/278/orig -> origin/gh/williamwen42/278/orig 2025-09-07T07:38:46.4977242Z * [new branch] gh/williamwen42/279/base -> origin/gh/williamwen42/279/base 2025-09-07T07:38:46.4977691Z * [new branch] gh/williamwen42/279/head -> origin/gh/williamwen42/279/head 2025-09-07T07:38:46.4978273Z * [new branch] gh/williamwen42/279/orig -> origin/gh/williamwen42/279/orig 2025-09-07T07:38:46.4979006Z * [new branch] gh/williamwen42/280/base -> origin/gh/williamwen42/280/base 2025-09-07T07:38:46.4979446Z * [new branch] gh/williamwen42/280/head -> origin/gh/williamwen42/280/head 2025-09-07T07:38:46.4979970Z * [new branch] gh/williamwen42/280/orig -> origin/gh/williamwen42/280/orig 2025-09-07T07:38:46.4980780Z * [new branch] gh/williamwen42/281/base -> origin/gh/williamwen42/281/base 2025-09-07T07:38:46.4981182Z * [new branch] gh/williamwen42/281/head -> origin/gh/williamwen42/281/head 2025-09-07T07:38:46.4981800Z * [new branch] gh/williamwen42/281/orig -> origin/gh/williamwen42/281/orig 2025-09-07T07:38:46.4982724Z * [new branch] gh/williamwen42/282/base -> origin/gh/williamwen42/282/base 2025-09-07T07:38:46.4983169Z * [new branch] gh/williamwen42/282/head -> origin/gh/williamwen42/282/head 2025-09-07T07:38:46.4983858Z * [new branch] gh/williamwen42/282/orig -> origin/gh/williamwen42/282/orig 2025-09-07T07:38:46.4984771Z * [new branch] gh/williamwen42/283/base -> origin/gh/williamwen42/283/base 2025-09-07T07:38:46.4985464Z * [new branch] gh/williamwen42/283/head -> origin/gh/williamwen42/283/head 2025-09-07T07:38:46.4985774Z * [new branch] gh/williamwen42/283/orig -> origin/gh/williamwen42/283/orig 2025-09-07T07:38:46.4986843Z * [new branch] gh/williamwen42/284/base -> origin/gh/williamwen42/284/base 2025-09-07T07:38:46.4987299Z * [new branch] gh/williamwen42/284/head -> origin/gh/williamwen42/284/head 2025-09-07T07:38:46.4987891Z * [new branch] gh/williamwen42/284/orig -> origin/gh/williamwen42/284/orig 2025-09-07T07:38:46.4988484Z * [new branch] gh/williamwen42/285/base -> origin/gh/williamwen42/285/base 2025-09-07T07:38:46.4989663Z * [new branch] gh/williamwen42/285/head -> origin/gh/williamwen42/285/head 2025-09-07T07:38:46.4989811Z * [new branch] gh/williamwen42/285/orig -> origin/gh/williamwen42/285/orig 2025-09-07T07:38:46.4990042Z * [new branch] gh/williamwen42/286/base -> origin/gh/williamwen42/286/base 2025-09-07T07:38:46.4991784Z * [new branch] gh/williamwen42/286/head -> origin/gh/williamwen42/286/head 2025-09-07T07:38:46.4991914Z * [new branch] gh/williamwen42/286/orig -> origin/gh/williamwen42/286/orig 2025-09-07T07:38:46.4992055Z * [new branch] gh/williamwen42/287/base -> origin/gh/williamwen42/287/base 2025-09-07T07:38:46.4992392Z * [new branch] gh/williamwen42/287/head -> origin/gh/williamwen42/287/head 2025-09-07T07:38:46.4993200Z * [new branch] 
gh/williamwen42/287/orig -> origin/gh/williamwen42/287/orig 2025-09-07T07:38:46.4994036Z * [new branch] gh/williamwen42/288/base -> origin/gh/williamwen42/288/base 2025-09-07T07:38:46.4994457Z * [new branch] gh/williamwen42/288/head -> origin/gh/williamwen42/288/head 2025-09-07T07:38:46.4994974Z * [new branch] gh/williamwen42/288/orig -> origin/gh/williamwen42/288/orig 2025-09-07T07:38:46.4995793Z * [new branch] gh/williamwen42/289/base -> origin/gh/williamwen42/289/base 2025-09-07T07:38:46.4996218Z * [new branch] gh/williamwen42/289/head -> origin/gh/williamwen42/289/head 2025-09-07T07:38:46.4996794Z * [new branch] gh/williamwen42/289/orig -> origin/gh/williamwen42/289/orig 2025-09-07T07:38:46.4997955Z * [new branch] gh/wychi/1/base -> origin/gh/wychi/1/base 2025-09-07T07:38:46.4998556Z * [new branch] gh/wychi/1/head -> origin/gh/wychi/1/head 2025-09-07T07:38:46.4999224Z * [new branch] gh/wychi/1/orig -> origin/gh/wychi/1/orig 2025-09-07T07:38:46.5000404Z * [new branch] gh/xmfan/169/base -> origin/gh/xmfan/169/base 2025-09-07T07:38:46.5000842Z * [new branch] gh/xmfan/169/head -> origin/gh/xmfan/169/head 2025-09-07T07:38:46.5001594Z * [new branch] gh/xmfan/170/base -> origin/gh/xmfan/170/base 2025-09-07T07:38:46.5001999Z * [new branch] gh/xmfan/170/head -> origin/gh/xmfan/170/head 2025-09-07T07:38:46.5002987Z * [new branch] gh/xmfan/18/base -> origin/gh/xmfan/18/base 2025-09-07T07:38:46.5003404Z * [new branch] gh/xmfan/18/head -> origin/gh/xmfan/18/head 2025-09-07T07:38:46.5004177Z * [new branch] gh/xmfan/229/base -> origin/gh/xmfan/229/base 2025-09-07T07:38:46.5004602Z * [new branch] gh/xmfan/229/head -> origin/gh/xmfan/229/head 2025-09-07T07:38:46.5005194Z * [new branch] gh/xmfan/229/orig -> origin/gh/xmfan/229/orig 2025-09-07T07:38:46.5006264Z * [new branch] gh/xmfan/237/base -> origin/gh/xmfan/237/base 2025-09-07T07:38:46.5006734Z * [new branch] gh/xmfan/237/head -> origin/gh/xmfan/237/head 2025-09-07T07:38:46.5007339Z * [new branch] gh/xmfan/237/orig -> origin/gh/xmfan/237/orig 2025-09-07T07:38:46.5008310Z * [new branch] gh/xmfan/244/base -> origin/gh/xmfan/244/base 2025-09-07T07:38:46.5008721Z * [new branch] gh/xmfan/244/head -> origin/gh/xmfan/244/head 2025-09-07T07:38:46.5009345Z * [new branch] gh/xmfan/244/orig -> origin/gh/xmfan/244/orig 2025-09-07T07:38:46.5010014Z * [new branch] gh/xmfan/246/base -> origin/gh/xmfan/246/base 2025-09-07T07:38:46.5010424Z * [new branch] gh/xmfan/246/head -> origin/gh/xmfan/246/head 2025-09-07T07:38:46.5011005Z * [new branch] gh/xmfan/246/orig -> origin/gh/xmfan/246/orig 2025-09-07T07:38:46.5011743Z * [new branch] gh/xmfan/253/base -> origin/gh/xmfan/253/base 2025-09-07T07:38:46.5012417Z * [new branch] gh/xmfan/253/head -> origin/gh/xmfan/253/head 2025-09-07T07:38:46.5012839Z * [new branch] gh/xmfan/253/orig -> origin/gh/xmfan/253/orig 2025-09-07T07:38:46.5013703Z * [new branch] gh/xmfan/254/base -> origin/gh/xmfan/254/base 2025-09-07T07:38:46.5014137Z * [new branch] gh/xmfan/254/head -> origin/gh/xmfan/254/head 2025-09-07T07:38:46.5014738Z * [new branch] gh/xmfan/254/orig -> origin/gh/xmfan/254/orig 2025-09-07T07:38:46.5015425Z * [new branch] gh/xmfan/260/base -> origin/gh/xmfan/260/base 2025-09-07T07:38:46.5015840Z * [new branch] gh/xmfan/260/head -> origin/gh/xmfan/260/head 2025-09-07T07:38:46.5016480Z * [new branch] gh/xmfan/260/orig -> origin/gh/xmfan/260/orig 2025-09-07T07:38:46.5017145Z * [new branch] gh/xmfan/262/base -> origin/gh/xmfan/262/base 2025-09-07T07:38:46.5017545Z * [new branch] gh/xmfan/262/head -> origin/gh/xmfan/262/head 
2025-09-07T07:38:46.5018155Z * [new branch] gh/xmfan/262/orig -> origin/gh/xmfan/262/orig 2025-09-07T07:38:46.5018850Z * [new branch] gh/xmfan/263/base -> origin/gh/xmfan/263/base 2025-09-07T07:38:46.5019229Z * [new branch] gh/xmfan/263/head -> origin/gh/xmfan/263/head 2025-09-07T07:38:46.5019828Z * [new branch] gh/xmfan/263/orig -> origin/gh/xmfan/263/orig 2025-09-07T07:38:46.5020542Z * [new branch] gh/xmfan/264/base -> origin/gh/xmfan/264/base 2025-09-07T07:38:46.5021169Z * [new branch] gh/xmfan/264/head -> origin/gh/xmfan/264/head 2025-09-07T07:38:46.5021547Z * [new branch] gh/xmfan/264/orig -> origin/gh/xmfan/264/orig 2025-09-07T07:38:46.5022340Z * [new branch] gh/xmfan/274/base -> origin/gh/xmfan/274/base 2025-09-07T07:38:46.5022788Z * [new branch] gh/xmfan/274/head -> origin/gh/xmfan/274/head 2025-09-07T07:38:46.5023386Z * [new branch] gh/xmfan/274/orig -> origin/gh/xmfan/274/orig 2025-09-07T07:38:46.5024093Z * [new branch] gh/xmfan/276/base -> origin/gh/xmfan/276/base 2025-09-07T07:38:46.5024519Z * [new branch] gh/xmfan/276/head -> origin/gh/xmfan/276/head 2025-09-07T07:38:46.5025090Z * [new branch] gh/xmfan/276/orig -> origin/gh/xmfan/276/orig 2025-09-07T07:38:46.5026059Z * [new branch] gh/xmfan/277/base -> origin/gh/xmfan/277/base 2025-09-07T07:38:46.5026474Z * [new branch] gh/xmfan/277/head -> origin/gh/xmfan/277/head 2025-09-07T07:38:46.5027035Z * [new branch] gh/xmfan/277/orig -> origin/gh/xmfan/277/orig 2025-09-07T07:38:46.5028058Z * [new branch] gh/xmfan/278/base -> origin/gh/xmfan/278/base 2025-09-07T07:38:46.5028460Z * [new branch] gh/xmfan/278/head -> origin/gh/xmfan/278/head 2025-09-07T07:38:46.5029099Z * [new branch] gh/xmfan/278/orig -> origin/gh/xmfan/278/orig 2025-09-07T07:38:46.5030042Z * [new branch] gh/xmfan/279/base -> origin/gh/xmfan/279/base 2025-09-07T07:38:46.5030677Z * [new branch] gh/xmfan/279/head -> origin/gh/xmfan/279/head 2025-09-07T07:38:46.5031102Z * [new branch] gh/xmfan/279/orig -> origin/gh/xmfan/279/orig 2025-09-07T07:38:46.5031890Z * [new branch] gh/xmfan/280/base -> origin/gh/xmfan/280/base 2025-09-07T07:38:46.5032294Z * [new branch] gh/xmfan/280/head -> origin/gh/xmfan/280/head 2025-09-07T07:38:46.5032879Z * [new branch] gh/xmfan/280/orig -> origin/gh/xmfan/280/orig 2025-09-07T07:38:46.5033571Z * [new branch] gh/xmfan/281/base -> origin/gh/xmfan/281/base 2025-09-07T07:38:46.5033972Z * [new branch] gh/xmfan/281/head -> origin/gh/xmfan/281/head 2025-09-07T07:38:46.5034608Z * [new branch] gh/xmfan/281/orig -> origin/gh/xmfan/281/orig 2025-09-07T07:38:46.5035378Z * [new branch] gh/xmfan/282/base -> origin/gh/xmfan/282/base 2025-09-07T07:38:46.5035828Z * [new branch] gh/xmfan/282/head -> origin/gh/xmfan/282/head 2025-09-07T07:38:46.5036572Z * [new branch] gh/xmfan/283/base -> origin/gh/xmfan/283/base 2025-09-07T07:38:46.5037028Z * [new branch] gh/xmfan/283/head -> origin/gh/xmfan/283/head 2025-09-07T07:38:46.5037595Z * [new branch] gh/xmfan/283/orig -> origin/gh/xmfan/283/orig 2025-09-07T07:38:46.5038492Z * [new branch] gh/xuanzhang816/14/base -> origin/gh/xuanzhang816/14/base 2025-09-07T07:38:46.5041436Z * [new branch] gh/xuanzhang816/14/head -> origin/gh/xuanzhang816/14/head 2025-09-07T07:38:46.5042182Z * [new branch] gh/xuanzhang816/14/orig -> origin/gh/xuanzhang816/14/orig 2025-09-07T07:38:46.5042925Z * [new branch] gh/xuanzhang816/19/base -> origin/gh/xuanzhang816/19/base 2025-09-07T07:38:46.5043332Z * [new branch] gh/xuanzhang816/19/head -> origin/gh/xuanzhang816/19/head 2025-09-07T07:38:46.5044388Z * [new branch] gh/xuanzhang816/19/orig -> 
origin/gh/xuanzhang816/19/orig 2025-09-07T07:38:46.5045137Z * [new branch] gh/xuanzhang816/22/base -> origin/gh/xuanzhang816/22/base 2025-09-07T07:38:46.5045620Z * [new branch] gh/xuanzhang816/22/head -> origin/gh/xuanzhang816/22/head 2025-09-07T07:38:46.5046216Z * [new branch] gh/xuanzhang816/22/orig -> origin/gh/xuanzhang816/22/orig 2025-09-07T07:38:46.5046880Z * [new branch] gh/xuanzhang816/23/base -> origin/gh/xuanzhang816/23/base 2025-09-07T07:38:46.5047287Z * [new branch] gh/xuanzhang816/23/head -> origin/gh/xuanzhang816/23/head 2025-09-07T07:38:46.5047874Z * [new branch] gh/xuanzhang816/23/orig -> origin/gh/xuanzhang816/23/orig 2025-09-07T07:38:46.5048583Z * [new branch] gh/xuanzhang816/24/base -> origin/gh/xuanzhang816/24/base 2025-09-07T07:38:46.5048991Z * [new branch] gh/xuanzhang816/24/head -> origin/gh/xuanzhang816/24/head 2025-09-07T07:38:46.5049608Z * [new branch] gh/xuanzhang816/24/orig -> origin/gh/xuanzhang816/24/orig 2025-09-07T07:38:46.5050306Z * [new branch] gh/xuanzhang816/25/base -> origin/gh/xuanzhang816/25/base 2025-09-07T07:38:46.5050966Z * [new branch] gh/xuanzhang816/25/head -> origin/gh/xuanzhang816/25/head 2025-09-07T07:38:46.5051382Z * [new branch] gh/xuanzhang816/25/orig -> origin/gh/xuanzhang816/25/orig 2025-09-07T07:38:46.5052245Z * [new branch] gh/xuanzhang816/26/base -> origin/gh/xuanzhang816/26/base 2025-09-07T07:38:46.5052643Z * [new branch] gh/xuanzhang816/26/head -> origin/gh/xuanzhang816/26/head 2025-09-07T07:38:46.5053285Z * [new branch] gh/xuanzhang816/26/orig -> origin/gh/xuanzhang816/26/orig 2025-09-07T07:38:46.5054220Z * [new branch] gh/yanbing-j/11/base -> origin/gh/yanbing-j/11/base 2025-09-07T07:38:46.5054625Z * [new branch] gh/yanbing-j/11/head -> origin/gh/yanbing-j/11/head 2025-09-07T07:38:46.5055254Z * [new branch] gh/yanbing-j/11/orig -> origin/gh/yanbing-j/11/orig 2025-09-07T07:38:46.5055979Z * [new branch] gh/yanbing-j/12/base -> origin/gh/yanbing-j/12/base 2025-09-07T07:38:46.5056415Z * [new branch] gh/yanbing-j/12/head -> origin/gh/yanbing-j/12/head 2025-09-07T07:38:46.5056989Z * [new branch] gh/yanbing-j/12/orig -> origin/gh/yanbing-j/12/orig 2025-09-07T07:38:46.5057727Z * [new branch] gh/yanbing-j/13/base -> origin/gh/yanbing-j/13/base 2025-09-07T07:38:46.5058157Z * [new branch] gh/yanbing-j/13/head -> origin/gh/yanbing-j/13/head 2025-09-07T07:38:46.5058735Z * [new branch] gh/yanbing-j/13/orig -> origin/gh/yanbing-j/13/orig 2025-09-07T07:38:46.5059481Z * [new branch] gh/yanbing-j/14/base -> origin/gh/yanbing-j/14/base 2025-09-07T07:38:46.5060129Z * [new branch] gh/yanbing-j/14/head -> origin/gh/yanbing-j/14/head 2025-09-07T07:38:46.5060532Z * [new branch] gh/yanbing-j/14/orig -> origin/gh/yanbing-j/14/orig 2025-09-07T07:38:46.5061293Z * [new branch] gh/yanbing-j/15/base -> origin/gh/yanbing-j/15/base 2025-09-07T07:38:46.5061752Z * [new branch] gh/yanbing-j/15/head -> origin/gh/yanbing-j/15/head 2025-09-07T07:38:46.5062322Z * [new branch] gh/yanbing-j/15/orig -> origin/gh/yanbing-j/15/orig 2025-09-07T07:38:46.5062979Z * [new branch] gh/yanbing-j/18/base -> origin/gh/yanbing-j/18/base 2025-09-07T07:38:46.5063365Z * [new branch] gh/yanbing-j/18/head -> origin/gh/yanbing-j/18/head 2025-09-07T07:38:46.5063949Z * [new branch] gh/yanbing-j/18/orig -> origin/gh/yanbing-j/18/orig 2025-09-07T07:38:46.5064634Z * [new branch] gh/yanbing-j/19/base -> origin/gh/yanbing-j/19/base 2025-09-07T07:38:46.5065039Z * [new branch] gh/yanbing-j/19/head -> origin/gh/yanbing-j/19/head 2025-09-07T07:38:46.5065621Z * [new branch] gh/yanbing-j/19/orig -> 
origin/gh/yanbing-j/19/orig 2025-09-07T07:38:46.5066421Z * [new branch] gh/yanbing-j/20/base -> origin/gh/yanbing-j/20/base 2025-09-07T07:38:46.5066825Z * [new branch] gh/yanbing-j/20/head -> origin/gh/yanbing-j/20/head 2025-09-07T07:38:46.5067401Z * [new branch] gh/yanbing-j/20/orig -> origin/gh/yanbing-j/20/orig 2025-09-07T07:38:46.5068438Z * [new branch] gh/yanbing-j/21/base -> origin/gh/yanbing-j/21/base 2025-09-07T07:38:46.5069081Z * [new branch] gh/yanbing-j/21/head -> origin/gh/yanbing-j/21/head 2025-09-07T07:38:46.5069833Z * [new branch] gh/yanbing-j/22/base -> origin/gh/yanbing-j/22/base 2025-09-07T07:38:46.5070259Z * [new branch] gh/yanbing-j/22/head -> origin/gh/yanbing-j/22/head 2025-09-07T07:38:46.5070864Z * [new branch] gh/yanbing-j/22/orig -> origin/gh/yanbing-j/22/orig 2025-09-07T07:38:46.5071550Z * [new branch] gh/yanbing-j/23/base -> origin/gh/yanbing-j/23/base 2025-09-07T07:38:46.5071996Z * [new branch] gh/yanbing-j/23/head -> origin/gh/yanbing-j/23/head 2025-09-07T07:38:46.5072638Z * [new branch] gh/yanbing-j/23/orig -> origin/gh/yanbing-j/23/orig 2025-09-07T07:38:46.5073337Z * [new branch] gh/yanbing-j/24/base -> origin/gh/yanbing-j/24/base 2025-09-07T07:38:46.5074101Z * [new branch] gh/yanbing-j/24/head -> origin/gh/yanbing-j/24/head 2025-09-07T07:38:46.5074552Z * [new branch] gh/yanbing-j/24/orig -> origin/gh/yanbing-j/24/orig 2025-09-07T07:38:46.5075318Z * [new branch] gh/yanbing-j/25/base -> origin/gh/yanbing-j/25/base 2025-09-07T07:38:46.5075731Z * [new branch] gh/yanbing-j/25/head -> origin/gh/yanbing-j/25/head 2025-09-07T07:38:46.5076308Z * [new branch] gh/yanbing-j/25/orig -> origin/gh/yanbing-j/25/orig 2025-09-07T07:38:46.5076999Z * [new branch] gh/yanbing-j/26/base -> origin/gh/yanbing-j/26/base 2025-09-07T07:38:46.5077426Z * [new branch] gh/yanbing-j/26/head -> origin/gh/yanbing-j/26/head 2025-09-07T07:38:46.5078106Z * [new branch] gh/yanbing-j/26/orig -> origin/gh/yanbing-j/26/orig 2025-09-07T07:38:46.5078839Z * [new branch] gh/yanbing-j/36/base -> origin/gh/yanbing-j/36/base 2025-09-07T07:38:46.5079220Z * [new branch] gh/yanbing-j/36/head -> origin/gh/yanbing-j/36/head 2025-09-07T07:38:46.5079845Z * [new branch] gh/yanbing-j/36/orig -> origin/gh/yanbing-j/36/orig 2025-09-07T07:38:46.5080593Z * [new branch] gh/yanbing-j/37/base -> origin/gh/yanbing-j/37/base 2025-09-07T07:38:46.5081054Z * [new branch] gh/yanbing-j/37/head -> origin/gh/yanbing-j/37/head 2025-09-07T07:38:46.5085325Z * [new branch] gh/yanbing-j/37/orig -> origin/gh/yanbing-j/37/orig 2025-09-07T07:38:46.5086520Z * [new branch] gh/yangw-dev/12/base -> origin/gh/yangw-dev/12/base 2025-09-07T07:38:46.5086953Z * [new branch] gh/yangw-dev/12/head -> origin/gh/yangw-dev/12/head 2025-09-07T07:38:46.5087527Z * [new branch] gh/yangw-dev/12/orig -> origin/gh/yangw-dev/12/orig 2025-09-07T07:38:46.5088276Z * [new branch] gh/yangw-dev/13/base -> origin/gh/yangw-dev/13/base 2025-09-07T07:38:46.5088729Z * [new branch] gh/yangw-dev/13/head -> origin/gh/yangw-dev/13/head 2025-09-07T07:38:46.5089312Z * [new branch] gh/yangw-dev/13/orig -> origin/gh/yangw-dev/13/orig 2025-09-07T07:38:46.5089999Z * [new branch] gh/yangw-dev/14/base -> origin/gh/yangw-dev/14/base 2025-09-07T07:38:46.5090415Z * [new branch] gh/yangw-dev/14/head -> origin/gh/yangw-dev/14/head 2025-09-07T07:38:46.5091155Z * [new branch] gh/yangw-dev/14/orig -> origin/gh/yangw-dev/14/orig 2025-09-07T07:38:46.5091951Z * [new branch] gh/yangw-dev/15/base -> origin/gh/yangw-dev/15/base 2025-09-07T07:38:46.5092394Z * [new branch] gh/yangw-dev/15/head -> 
origin/gh/yangw-dev/15/head 2025-09-07T07:38:46.5093041Z * [new branch] gh/yangw-dev/15/orig -> origin/gh/yangw-dev/15/orig 2025-09-07T07:38:46.5093698Z * [new branch] gh/yangw-dev/16/base -> origin/gh/yangw-dev/16/base 2025-09-07T07:38:46.5094099Z * [new branch] gh/yangw-dev/16/head -> origin/gh/yangw-dev/16/head 2025-09-07T07:38:46.5094680Z * [new branch] gh/yangw-dev/16/orig -> origin/gh/yangw-dev/16/orig 2025-09-07T07:38:46.5095392Z * [new branch] gh/yangw-dev/17/base -> origin/gh/yangw-dev/17/base 2025-09-07T07:38:46.5095802Z * [new branch] gh/yangw-dev/17/head -> origin/gh/yangw-dev/17/head 2025-09-07T07:38:46.5096653Z * [new branch] gh/yangw-dev/17/orig -> origin/gh/yangw-dev/17/orig 2025-09-07T07:38:46.5097354Z * [new branch] gh/yangw-dev/18/base -> origin/gh/yangw-dev/18/base 2025-09-07T07:38:46.5097750Z * [new branch] gh/yangw-dev/18/head -> origin/gh/yangw-dev/18/head 2025-09-07T07:38:46.5098329Z * [new branch] gh/yangw-dev/18/orig -> origin/gh/yangw-dev/18/orig 2025-09-07T07:38:46.5099098Z * [new branch] gh/yangw-dev/19/base -> origin/gh/yangw-dev/19/base 2025-09-07T07:38:46.5099769Z * [new branch] gh/yangw-dev/19/head -> origin/gh/yangw-dev/19/head 2025-09-07T07:38:46.5100123Z * [new branch] gh/yangw-dev/19/orig -> origin/gh/yangw-dev/19/orig 2025-09-07T07:38:46.5100909Z * [new branch] gh/yangw-dev/20/base -> origin/gh/yangw-dev/20/base 2025-09-07T07:38:46.5101354Z * [new branch] gh/yangw-dev/20/head -> origin/gh/yangw-dev/20/head 2025-09-07T07:38:46.5101921Z * [new branch] gh/yangw-dev/20/orig -> origin/gh/yangw-dev/20/orig 2025-09-07T07:38:46.5102655Z * [new branch] gh/yangw-dev/21/base -> origin/gh/yangw-dev/21/base 2025-09-07T07:38:46.5103066Z * [new branch] gh/yangw-dev/21/head -> origin/gh/yangw-dev/21/head 2025-09-07T07:38:46.5103650Z * [new branch] gh/yangw-dev/21/orig -> origin/gh/yangw-dev/21/orig 2025-09-07T07:38:46.5104348Z * [new branch] gh/yangw-dev/22/base -> origin/gh/yangw-dev/22/base 2025-09-07T07:38:46.5104804Z * [new branch] gh/yangw-dev/22/head -> origin/gh/yangw-dev/22/head 2025-09-07T07:38:46.5105411Z * [new branch] gh/yangw-dev/22/orig -> origin/gh/yangw-dev/22/orig 2025-09-07T07:38:46.5106114Z * [new branch] gh/yangw-dev/23/base -> origin/gh/yangw-dev/23/base 2025-09-07T07:38:46.5106537Z * [new branch] gh/yangw-dev/23/head -> origin/gh/yangw-dev/23/head 2025-09-07T07:38:46.5107109Z * [new branch] gh/yangw-dev/23/orig -> origin/gh/yangw-dev/23/orig 2025-09-07T07:38:46.5107786Z * [new branch] gh/yangw-dev/24/base -> origin/gh/yangw-dev/24/base 2025-09-07T07:38:46.5108425Z * [new branch] gh/yangw-dev/24/head -> origin/gh/yangw-dev/24/head 2025-09-07T07:38:46.5108833Z * [new branch] gh/yangw-dev/24/orig -> origin/gh/yangw-dev/24/orig 2025-09-07T07:38:46.5109624Z * [new branch] gh/yangw-dev/25/base -> origin/gh/yangw-dev/25/base 2025-09-07T07:38:46.5110048Z * [new branch] gh/yangw-dev/25/head -> origin/gh/yangw-dev/25/head 2025-09-07T07:38:46.5110659Z * [new branch] gh/yangw-dev/25/orig -> origin/gh/yangw-dev/25/orig 2025-09-07T07:38:46.5111364Z * [new branch] gh/yangw-dev/26/base -> origin/gh/yangw-dev/26/base 2025-09-07T07:38:46.5111765Z * [new branch] gh/yangw-dev/26/head -> origin/gh/yangw-dev/26/head 2025-09-07T07:38:46.5112349Z * [new branch] gh/yangw-dev/26/orig -> origin/gh/yangw-dev/26/orig 2025-09-07T07:38:46.5113123Z * [new branch] gh/yangw-dev/27/base -> origin/gh/yangw-dev/27/base 2025-09-07T07:38:46.5113568Z * [new branch] gh/yangw-dev/27/head -> origin/gh/yangw-dev/27/head 2025-09-07T07:38:46.5114144Z * [new branch] gh/yangw-dev/27/orig -> 
origin/gh/yangw-dev/27/orig 2025-09-07T07:38:46.5115040Z * [new branch] gh/ydwu4/233/base -> origin/gh/ydwu4/233/base 2025-09-07T07:38:46.5115902Z * [new branch] gh/ydwu4/233/head -> origin/gh/ydwu4/233/head 2025-09-07T07:38:46.5116296Z * [new branch] gh/ydwu4/233/orig -> origin/gh/ydwu4/233/orig 2025-09-07T07:38:46.5117269Z * [new branch] gh/ydwu4/246/base -> origin/gh/ydwu4/246/base 2025-09-07T07:38:46.5117902Z * [new branch] gh/ydwu4/246/head -> origin/gh/ydwu4/246/head 2025-09-07T07:38:46.5118337Z * [new branch] gh/ydwu4/246/orig -> origin/gh/ydwu4/246/orig 2025-09-07T07:38:46.5119218Z * [new branch] gh/ydwu4/253/base -> origin/gh/ydwu4/253/base 2025-09-07T07:38:46.5119789Z * [new branch] gh/ydwu4/253/head -> origin/gh/ydwu4/253/head 2025-09-07T07:38:46.5120350Z * [new branch] gh/ydwu4/253/orig -> origin/gh/ydwu4/253/orig 2025-09-07T07:38:46.5121057Z * [new branch] gh/ydwu4/255/base -> origin/gh/ydwu4/255/base 2025-09-07T07:38:46.5121506Z * [new branch] gh/ydwu4/255/head -> origin/gh/ydwu4/255/head 2025-09-07T07:38:46.5122067Z * [new branch] gh/ydwu4/255/orig -> origin/gh/ydwu4/255/orig 2025-09-07T07:38:46.5122869Z * [new branch] gh/ydwu4/259/base -> origin/gh/ydwu4/259/base 2025-09-07T07:38:46.5123300Z * [new branch] gh/ydwu4/259/head -> origin/gh/ydwu4/259/head 2025-09-07T07:38:46.5123877Z * [new branch] gh/ydwu4/259/orig -> origin/gh/ydwu4/259/orig 2025-09-07T07:38:46.5124665Z * [new branch] gh/ydwu4/262/base -> origin/gh/ydwu4/262/base 2025-09-07T07:38:46.5125187Z * [new branch] gh/ydwu4/262/head -> origin/gh/ydwu4/262/head 2025-09-07T07:38:46.5125685Z * [new branch] gh/ydwu4/262/orig -> origin/gh/ydwu4/262/orig 2025-09-07T07:38:46.5126426Z * [new branch] gh/ydwu4/263/base -> origin/gh/ydwu4/263/base 2025-09-07T07:38:46.5127036Z * [new branch] gh/ydwu4/263/head -> origin/gh/ydwu4/263/head 2025-09-07T07:38:46.5127597Z * [new branch] gh/ydwu4/263/orig -> origin/gh/ydwu4/263/orig 2025-09-07T07:38:46.5128437Z * [new branch] gh/ydwu4/269/base -> origin/gh/ydwu4/269/base 2025-09-07T07:38:46.5128839Z * [new branch] gh/ydwu4/269/head -> origin/gh/ydwu4/269/head 2025-09-07T07:38:46.5129352Z * [new branch] gh/ydwu4/269/orig -> origin/gh/ydwu4/269/orig 2025-09-07T07:38:46.5130121Z * [new branch] gh/ydwu4/270/base -> origin/gh/ydwu4/270/base 2025-09-07T07:38:46.5130686Z * [new branch] gh/ydwu4/270/head -> origin/gh/ydwu4/270/head 2025-09-07T07:38:46.5131140Z * [new branch] gh/ydwu4/270/orig -> origin/gh/ydwu4/270/orig 2025-09-07T07:38:46.5131933Z * [new branch] gh/ydwu4/272/base -> origin/gh/ydwu4/272/base 2025-09-07T07:38:46.5132499Z * [new branch] gh/ydwu4/272/head -> origin/gh/ydwu4/272/head 2025-09-07T07:38:46.5133072Z * [new branch] gh/ydwu4/272/orig -> origin/gh/ydwu4/272/orig 2025-09-07T07:38:46.5133669Z * [new branch] gh/ydwu4/275/base -> origin/gh/ydwu4/275/base 2025-09-07T07:38:46.5134094Z * [new branch] gh/ydwu4/275/head -> origin/gh/ydwu4/275/head 2025-09-07T07:38:46.5135039Z * [new branch] gh/ydwu4/275/orig -> origin/gh/ydwu4/275/orig 2025-09-07T07:38:46.5135688Z * [new branch] gh/ydwu4/276/base -> origin/gh/ydwu4/276/base 2025-09-07T07:38:46.5136321Z * [new branch] gh/ydwu4/276/head -> origin/gh/ydwu4/276/head 2025-09-07T07:38:46.5136715Z * [new branch] gh/ydwu4/276/orig -> origin/gh/ydwu4/276/orig 2025-09-07T07:38:46.5137740Z * [new branch] gh/ydwu4/279/base -> origin/gh/ydwu4/279/base 2025-09-07T07:38:46.5138349Z * [new branch] gh/ydwu4/279/head -> origin/gh/ydwu4/279/head 2025-09-07T07:38:46.5138816Z * [new branch] gh/ydwu4/279/orig -> origin/gh/ydwu4/279/orig 
2025-09-07T07:38:46.5139786Z * [new branch] gh/ydwu4/283/base -> origin/gh/ydwu4/283/base 2025-09-07T07:38:46.5140248Z * [new branch] gh/ydwu4/283/head -> origin/gh/ydwu4/283/head 2025-09-07T07:38:46.5140813Z * [new branch] gh/ydwu4/283/orig -> origin/gh/ydwu4/283/orig 2025-09-07T07:38:46.5141510Z * [new branch] gh/ydwu4/289/base -> origin/gh/ydwu4/289/base 2025-09-07T07:38:46.5141938Z * [new branch] gh/ydwu4/289/head -> origin/gh/ydwu4/289/head 2025-09-07T07:38:46.5142589Z * [new branch] gh/ydwu4/289/orig -> origin/gh/ydwu4/289/orig 2025-09-07T07:38:46.5143434Z * [new branch] gh/ydwu4/290/base -> origin/gh/ydwu4/290/base 2025-09-07T07:38:46.5143870Z * [new branch] gh/ydwu4/290/head -> origin/gh/ydwu4/290/head 2025-09-07T07:38:46.5144448Z * [new branch] gh/ydwu4/290/orig -> origin/gh/ydwu4/290/orig 2025-09-07T07:38:46.5145650Z * [new branch] gh/ydwu4/291/base -> origin/gh/ydwu4/291/base 2025-09-07T07:38:46.5146156Z * [new branch] gh/ydwu4/291/head -> origin/gh/ydwu4/291/head 2025-09-07T07:38:46.5146785Z * [new branch] gh/ydwu4/291/orig -> origin/gh/ydwu4/291/orig 2025-09-07T07:38:46.5147565Z * [new branch] gh/ydwu4/292/base -> origin/gh/ydwu4/292/base 2025-09-07T07:38:46.5147933Z * [new branch] gh/ydwu4/292/head -> origin/gh/ydwu4/292/head 2025-09-07T07:38:46.5148489Z * [new branch] gh/ydwu4/292/orig -> origin/gh/ydwu4/292/orig 2025-09-07T07:38:46.5149203Z * [new branch] gh/ydwu4/293/base -> origin/gh/ydwu4/293/base 2025-09-07T07:38:46.5149638Z * [new branch] gh/ydwu4/293/head -> origin/gh/ydwu4/293/head 2025-09-07T07:38:46.5150293Z * [new branch] gh/ydwu4/293/orig -> origin/gh/ydwu4/293/orig 2025-09-07T07:38:46.5151033Z * [new branch] gh/ydwu4/294/base -> origin/gh/ydwu4/294/base 2025-09-07T07:38:46.5151474Z * [new branch] gh/ydwu4/294/head -> origin/gh/ydwu4/294/head 2025-09-07T07:38:46.5152040Z * [new branch] gh/ydwu4/294/orig -> origin/gh/ydwu4/294/orig 2025-09-07T07:38:46.5152808Z * [new branch] gh/ydwu4/295/base -> origin/gh/ydwu4/295/base 2025-09-07T07:38:46.5153440Z * [new branch] gh/ydwu4/295/head -> origin/gh/ydwu4/295/head 2025-09-07T07:38:46.5153778Z * [new branch] gh/ydwu4/295/orig -> origin/gh/ydwu4/295/orig 2025-09-07T07:38:46.5154674Z * [new branch] gh/ydwu4/296/base -> origin/gh/ydwu4/296/base 2025-09-07T07:38:46.5155075Z * [new branch] gh/ydwu4/296/head -> origin/gh/ydwu4/296/head 2025-09-07T07:38:46.5155634Z * [new branch] gh/ydwu4/296/orig -> origin/gh/ydwu4/296/orig 2025-09-07T07:38:46.5157001Z * [new branch] gh/ydwu4/300/base -> origin/gh/ydwu4/300/base 2025-09-07T07:38:46.5157826Z * [new branch] gh/ydwu4/300/head -> origin/gh/ydwu4/300/head 2025-09-07T07:38:46.5158461Z * [new branch] gh/ydwu4/300/orig -> origin/gh/ydwu4/300/orig 2025-09-07T07:38:46.5159435Z * [new branch] gh/ydwu4/301/base -> origin/gh/ydwu4/301/base 2025-09-07T07:38:46.5159827Z * [new branch] gh/ydwu4/301/head -> origin/gh/ydwu4/301/head 2025-09-07T07:38:46.5160390Z * [new branch] gh/ydwu4/301/orig -> origin/gh/ydwu4/301/orig 2025-09-07T07:38:46.5161128Z * [new branch] gh/ydwu4/302/base -> origin/gh/ydwu4/302/base 2025-09-07T07:38:46.5161549Z * [new branch] gh/ydwu4/302/head -> origin/gh/ydwu4/302/head 2025-09-07T07:38:46.5162119Z * [new branch] gh/ydwu4/302/orig -> origin/gh/ydwu4/302/orig 2025-09-07T07:38:46.5162758Z * [new branch] gh/ydwu4/303/base -> origin/gh/ydwu4/303/base 2025-09-07T07:38:46.5163159Z * [new branch] gh/ydwu4/303/head -> origin/gh/ydwu4/303/head 2025-09-07T07:38:46.5163761Z * [new branch] gh/ydwu4/303/orig -> origin/gh/ydwu4/303/orig 2025-09-07T07:38:46.5164528Z * [new branch] 
gh/ydwu4/304/base -> origin/gh/ydwu4/304/base 2025-09-07T07:38:46.5164960Z * [new branch] gh/ydwu4/304/head -> origin/gh/ydwu4/304/head 2025-09-07T07:38:46.5165645Z * [new branch] gh/ydwu4/304/orig -> origin/gh/ydwu4/304/orig 2025-09-07T07:38:46.5166565Z * [new branch] gh/ydwu4/305/base -> origin/gh/ydwu4/305/base 2025-09-07T07:38:46.5166985Z * [new branch] gh/ydwu4/305/head -> origin/gh/ydwu4/305/head 2025-09-07T07:38:46.5167519Z * [new branch] gh/ydwu4/305/orig -> origin/gh/ydwu4/305/orig 2025-09-07T07:38:46.5168352Z * [new branch] gh/ydwu4/306/base -> origin/gh/ydwu4/306/base 2025-09-07T07:38:46.5168807Z * [new branch] gh/ydwu4/306/head -> origin/gh/ydwu4/306/head 2025-09-07T07:38:46.5169420Z * [new branch] gh/ydwu4/306/orig -> origin/gh/ydwu4/306/orig 2025-09-07T07:38:46.5170060Z * [new branch] gh/ydwu4/307/base -> origin/gh/ydwu4/307/base 2025-09-07T07:38:46.5170459Z * [new branch] gh/ydwu4/307/head -> origin/gh/ydwu4/307/head 2025-09-07T07:38:46.5171020Z * [new branch] gh/ydwu4/307/orig -> origin/gh/ydwu4/307/orig 2025-09-07T07:38:46.5171822Z * [new branch] gh/ydwu4/308/base -> origin/gh/ydwu4/308/base 2025-09-07T07:38:46.5172283Z * [new branch] gh/ydwu4/308/head -> origin/gh/ydwu4/308/head 2025-09-07T07:38:46.5172901Z * [new branch] gh/ydwu4/308/orig -> origin/gh/ydwu4/308/orig 2025-09-07T07:38:46.5173723Z * [new branch] gh/ydwu4/309/base -> origin/gh/ydwu4/309/base 2025-09-07T07:38:46.5174146Z * [new branch] gh/ydwu4/309/head -> origin/gh/ydwu4/309/head 2025-09-07T07:38:46.5174748Z * [new branch] gh/ydwu4/309/orig -> origin/gh/ydwu4/309/orig 2025-09-07T07:38:46.5175568Z * [new branch] gh/ydwu4/310/base -> origin/gh/ydwu4/310/base 2025-09-07T07:38:46.5176209Z * [new branch] gh/ydwu4/310/head -> origin/gh/ydwu4/310/head 2025-09-07T07:38:46.5176610Z * [new branch] gh/ydwu4/310/orig -> origin/gh/ydwu4/310/orig 2025-09-07T07:38:46.5177410Z * [new branch] gh/ydwu4/311/base -> origin/gh/ydwu4/311/base 2025-09-07T07:38:46.5177977Z * [new branch] gh/ydwu4/311/head -> origin/gh/ydwu4/311/head 2025-09-07T07:38:46.5178405Z * [new branch] gh/ydwu4/311/orig -> origin/gh/ydwu4/311/orig 2025-09-07T07:38:46.5179202Z * [new branch] gh/ydwu4/312/base -> origin/gh/ydwu4/312/base 2025-09-07T07:38:46.5179608Z * [new branch] gh/ydwu4/312/head -> origin/gh/ydwu4/312/head 2025-09-07T07:38:46.5180250Z * [new branch] gh/ydwu4/312/orig -> origin/gh/ydwu4/312/orig 2025-09-07T07:38:46.5181128Z * [new branch] gh/ydwu4/313/base -> origin/gh/ydwu4/313/base 2025-09-07T07:38:46.5181789Z * [new branch] gh/ydwu4/313/head -> origin/gh/ydwu4/313/head 2025-09-07T07:38:46.5182234Z * [new branch] gh/ydwu4/313/orig -> origin/gh/ydwu4/313/orig 2025-09-07T07:38:46.5183195Z * [new branch] gh/ydwu4/314/base -> origin/gh/ydwu4/314/base 2025-09-07T07:38:46.5183856Z * [new branch] gh/ydwu4/314/head -> origin/gh/ydwu4/314/head 2025-09-07T07:38:46.5184274Z * [new branch] gh/ydwu4/314/orig -> origin/gh/ydwu4/314/orig 2025-09-07T07:38:46.5185141Z * [new branch] gh/ydwu4/315/base -> origin/gh/ydwu4/315/base 2025-09-07T07:38:46.5185560Z * [new branch] gh/ydwu4/315/head -> origin/gh/ydwu4/315/head 2025-09-07T07:38:46.5186135Z * [new branch] gh/ydwu4/315/orig -> origin/gh/ydwu4/315/orig 2025-09-07T07:38:46.5186943Z * [new branch] gh/ydwu4/316/base -> origin/gh/ydwu4/316/base 2025-09-07T07:38:46.5187511Z * [new branch] gh/ydwu4/316/head -> origin/gh/ydwu4/316/head 2025-09-07T07:38:46.5188084Z * [new branch] gh/ydwu4/316/orig -> origin/gh/ydwu4/316/orig 2025-09-07T07:38:46.5188850Z * [new branch] gh/ydwu4/317/base -> 
origin/gh/ydwu4/317/base 2025-09-07T07:38:46.5189200Z * [new branch] gh/ydwu4/317/head -> origin/gh/ydwu4/317/head 2025-09-07T07:38:46.5189808Z * [new branch] gh/ydwu4/317/orig -> origin/gh/ydwu4/317/orig 2025-09-07T07:38:46.5190530Z * [new branch] gh/ydwu4/318/base -> origin/gh/ydwu4/318/base 2025-09-07T07:38:46.5190999Z * [new branch] gh/ydwu4/318/head -> origin/gh/ydwu4/318/head 2025-09-07T07:38:46.5191686Z * [new branch] gh/ydwu4/318/orig -> origin/gh/ydwu4/318/orig 2025-09-07T07:38:46.5192346Z * [new branch] gh/ydwu4/319/base -> origin/gh/ydwu4/319/base 2025-09-07T07:38:46.5192765Z * [new branch] gh/ydwu4/319/head -> origin/gh/ydwu4/319/head 2025-09-07T07:38:46.5193685Z * [new branch] gh/ydwu4/319/orig -> origin/gh/ydwu4/319/orig 2025-09-07T07:38:46.5194504Z * [new branch] gh/ydwu4/320/base -> origin/gh/ydwu4/320/base 2025-09-07T07:38:46.5194918Z * [new branch] gh/ydwu4/320/head -> origin/gh/ydwu4/320/head 2025-09-07T07:38:46.5195557Z * [new branch] gh/ydwu4/320/orig -> origin/gh/ydwu4/320/orig 2025-09-07T07:38:46.5196224Z * [new branch] gh/ydwu4/321/base -> origin/gh/ydwu4/321/base 2025-09-07T07:38:46.5196676Z * [new branch] gh/ydwu4/321/head -> origin/gh/ydwu4/321/head 2025-09-07T07:38:46.5197240Z * [new branch] gh/ydwu4/321/orig -> origin/gh/ydwu4/321/orig 2025-09-07T07:38:46.5197963Z * [new branch] gh/ydwu4/322/base -> origin/gh/ydwu4/322/base 2025-09-07T07:38:46.5198395Z * [new branch] gh/ydwu4/322/head -> origin/gh/ydwu4/322/head 2025-09-07T07:38:46.5199284Z * [new branch] gh/ydwu4/322/orig -> origin/gh/ydwu4/322/orig 2025-09-07T07:38:46.5200024Z * [new branch] gh/ydwu4/323/base -> origin/gh/ydwu4/323/base 2025-09-07T07:38:46.5200422Z * [new branch] gh/ydwu4/323/head -> origin/gh/ydwu4/323/head 2025-09-07T07:38:46.5201126Z * [new branch] gh/ydwu4/323/orig -> origin/gh/ydwu4/323/orig 2025-09-07T07:38:46.5201829Z * [new branch] gh/ydwu4/324/base -> origin/gh/ydwu4/324/base 2025-09-07T07:38:46.5202248Z * [new branch] gh/ydwu4/324/head -> origin/gh/ydwu4/324/head 2025-09-07T07:38:46.5202832Z * [new branch] gh/ydwu4/324/orig -> origin/gh/ydwu4/324/orig 2025-09-07T07:38:46.5203860Z * [new branch] gh/yf225/133/base -> origin/gh/yf225/133/base 2025-09-07T07:38:46.5204260Z * [new branch] gh/yf225/133/head -> origin/gh/yf225/133/head 2025-09-07T07:38:46.5205243Z * [new branch] gh/yf225/171/base -> origin/gh/yf225/171/base 2025-09-07T07:38:46.5205806Z * [new branch] gh/yf225/171/head -> origin/gh/yf225/171/head 2025-09-07T07:38:46.5206254Z * [new branch] gh/yf225/171/orig -> origin/gh/yf225/171/orig 2025-09-07T07:38:46.5207370Z * [new branch] gh/yf225/172/base -> origin/gh/yf225/172/base 2025-09-07T07:38:46.5207746Z * [new branch] gh/yf225/172/head -> origin/gh/yf225/172/head 2025-09-07T07:38:46.5208326Z * [new branch] gh/yf225/172/orig -> origin/gh/yf225/172/orig 2025-09-07T07:38:46.5209036Z * [new branch] gh/yf225/93/base -> origin/gh/yf225/93/base 2025-09-07T07:38:46.5209490Z * [new branch] gh/yf225/93/head -> origin/gh/yf225/93/head 2025-09-07T07:38:46.5210900Z * [new branch] gh/yifuwang/152/base -> origin/gh/yifuwang/152/base 2025-09-07T07:38:46.5211689Z * [new branch] gh/yifuwang/152/head -> origin/gh/yifuwang/152/head 2025-09-07T07:38:46.5212201Z * [new branch] gh/yifuwang/152/orig -> origin/gh/yifuwang/152/orig 2025-09-07T07:38:46.5213005Z * [new branch] gh/yifuwang/195/base -> origin/gh/yifuwang/195/base 2025-09-07T07:38:46.5213578Z * [new branch] gh/yifuwang/195/head -> origin/gh/yifuwang/195/head 2025-09-07T07:38:46.5213990Z * [new branch] gh/yifuwang/195/orig -> 
origin/gh/yifuwang/195/orig 2025-09-07T07:38:46.5215016Z * [new branch] gh/yiming0416/1/base -> origin/gh/yiming0416/1/base 2025-09-07T07:38:46.5215420Z * [new branch] gh/yiming0416/1/head -> origin/gh/yiming0416/1/head 2025-09-07T07:38:46.5216130Z * [new branch] gh/yiming0416/2/base -> origin/gh/yiming0416/2/base 2025-09-07T07:38:46.5216533Z * [new branch] gh/yiming0416/2/head -> origin/gh/yiming0416/2/head 2025-09-07T07:38:46.5217477Z * [new branch] gh/ysiraichi/79/base -> origin/gh/ysiraichi/79/base 2025-09-07T07:38:46.5218275Z * [new branch] gh/ysiraichi/79/head -> origin/gh/ysiraichi/79/head 2025-09-07T07:38:46.5218931Z * [new branch] gh/ysiraichi/79/orig -> origin/gh/ysiraichi/79/orig 2025-09-07T07:38:46.5219638Z * [new branch] gh/ysiraichi/88/base -> origin/gh/ysiraichi/88/base 2025-09-07T07:38:46.5220025Z * [new branch] gh/ysiraichi/88/head -> origin/gh/ysiraichi/88/head 2025-09-07T07:38:46.5220645Z * [new branch] gh/ysiraichi/88/orig -> origin/gh/ysiraichi/88/orig 2025-09-07T07:38:46.5221724Z * [new branch] gh/zhxchen17/25/base -> origin/gh/zhxchen17/25/base 2025-09-07T07:38:46.5222136Z * [new branch] gh/zhxchen17/25/head -> origin/gh/zhxchen17/25/head 2025-09-07T07:38:46.5222715Z * [new branch] gh/zhxchen17/25/orig -> origin/gh/zhxchen17/25/orig 2025-09-07T07:38:46.5223580Z * [new branch] gh/zhxchen17/31/base -> origin/gh/zhxchen17/31/base 2025-09-07T07:38:46.5224150Z * [new branch] gh/zhxchen17/31/head -> origin/gh/zhxchen17/31/head 2025-09-07T07:38:46.5224609Z * [new branch] gh/zhxchen17/31/orig -> origin/gh/zhxchen17/31/orig 2025-09-07T07:38:46.5225389Z * [new branch] gh/zhxchen17/34/base -> origin/gh/zhxchen17/34/base 2025-09-07T07:38:46.5225862Z * [new branch] gh/zhxchen17/34/head -> origin/gh/zhxchen17/34/head 2025-09-07T07:38:46.5226530Z * [new branch] gh/zhxchen17/35/base -> origin/gh/zhxchen17/35/base 2025-09-07T07:38:46.5226984Z * [new branch] gh/zhxchen17/35/head -> origin/gh/zhxchen17/35/head 2025-09-07T07:38:46.5227968Z * [new branch] gh/zhxchen17/37/base -> origin/gh/zhxchen17/37/base 2025-09-07T07:38:46.5228567Z * [new branch] gh/zhxchen17/37/head -> origin/gh/zhxchen17/37/head 2025-09-07T07:38:46.5228931Z * [new branch] gh/zhxchen17/37/orig -> origin/gh/zhxchen17/37/orig 2025-09-07T07:38:46.5229804Z * [new branch] gh/zhxchen17/38/base -> origin/gh/zhxchen17/38/base 2025-09-07T07:38:46.5230270Z * [new branch] gh/zhxchen17/38/head -> origin/gh/zhxchen17/38/head 2025-09-07T07:38:46.5231003Z * [new branch] gh/zhxchen17/38/orig -> origin/gh/zhxchen17/38/orig 2025-09-07T07:38:46.5231662Z * [new branch] gh/zhxchen17/39/base -> origin/gh/zhxchen17/39/base 2025-09-07T07:38:46.5232238Z * [new branch] gh/zhxchen17/39/head -> origin/gh/zhxchen17/39/head 2025-09-07T07:38:46.5232702Z * [new branch] gh/zhxchen17/39/orig -> origin/gh/zhxchen17/39/orig 2025-09-07T07:38:46.5233600Z * [new branch] gh/zhxchen17/40/base -> origin/gh/zhxchen17/40/base 2025-09-07T07:38:46.5234049Z * [new branch] gh/zhxchen17/40/head -> origin/gh/zhxchen17/40/head 2025-09-07T07:38:46.5234753Z * [new branch] gh/zhxchen17/40/orig -> origin/gh/zhxchen17/40/orig 2025-09-07T07:38:46.5235500Z * [new branch] gh/zhxchen17/41/base -> origin/gh/zhxchen17/41/base 2025-09-07T07:38:46.5236107Z * [new branch] gh/zhxchen17/41/head -> origin/gh/zhxchen17/41/head 2025-09-07T07:38:46.5236814Z * [new branch] gh/zhxchen17/41/orig -> origin/gh/zhxchen17/41/orig 2025-09-07T07:38:46.5237931Z * [new branch] gh/zhxchen17/42/base -> origin/gh/zhxchen17/42/base 2025-09-07T07:38:46.5238531Z * [new branch] gh/zhxchen17/42/head -> 
origin/gh/zhxchen17/42/head 2025-09-07T07:38:46.5239199Z * [new branch] gh/zhxchen17/42/orig -> origin/gh/zhxchen17/42/orig 2025-09-07T07:38:46.5240204Z * [new branch] gh/zhxchen17/43/base -> origin/gh/zhxchen17/43/base 2025-09-07T07:38:46.5240933Z * [new branch] gh/zhxchen17/43/head -> origin/gh/zhxchen17/43/head 2025-09-07T07:38:46.5241499Z * [new branch] gh/zhxchen17/43/orig -> origin/gh/zhxchen17/43/orig 2025-09-07T07:38:46.5242306Z * [new branch] gh/zhxchen17/44/base -> origin/gh/zhxchen17/44/base 2025-09-07T07:38:46.5242717Z * [new branch] gh/zhxchen17/44/head -> origin/gh/zhxchen17/44/head 2025-09-07T07:38:46.5243333Z * [new branch] gh/zhxchen17/44/orig -> origin/gh/zhxchen17/44/orig 2025-09-07T07:38:46.5244074Z * [new branch] gh/zhxchen17/45/base -> origin/gh/zhxchen17/45/base 2025-09-07T07:38:46.5244650Z * [new branch] gh/zhxchen17/45/head -> origin/gh/zhxchen17/45/head 2025-09-07T07:38:46.5245219Z * [new branch] gh/zhxchen17/45/orig -> origin/gh/zhxchen17/45/orig 2025-09-07T07:38:46.5246460Z * [new branch] gh/zklaus/10/base -> origin/gh/zklaus/10/base 2025-09-07T07:38:46.5246892Z * [new branch] gh/zklaus/10/head -> origin/gh/zklaus/10/head 2025-09-07T07:38:46.5247535Z * [new branch] gh/zklaus/10/orig -> origin/gh/zklaus/10/orig 2025-09-07T07:38:46.5248258Z * [new branch] gh/zklaus/11/base -> origin/gh/zklaus/11/base 2025-09-07T07:38:46.5248734Z * [new branch] gh/zklaus/11/head -> origin/gh/zklaus/11/head 2025-09-07T07:38:46.5249244Z * [new branch] gh/zklaus/11/orig -> origin/gh/zklaus/11/orig 2025-09-07T07:38:46.5250025Z * [new branch] gh/zklaus/12/base -> origin/gh/zklaus/12/base 2025-09-07T07:38:46.5250663Z * [new branch] gh/zklaus/12/head -> origin/gh/zklaus/12/head 2025-09-07T07:38:46.5251063Z * [new branch] gh/zklaus/12/orig -> origin/gh/zklaus/12/orig 2025-09-07T07:38:46.5251867Z * [new branch] gh/zklaus/14/base -> origin/gh/zklaus/14/base 2025-09-07T07:38:46.5252268Z * [new branch] gh/zklaus/14/head -> origin/gh/zklaus/14/head 2025-09-07T07:38:46.5252873Z * [new branch] gh/zklaus/14/orig -> origin/gh/zklaus/14/orig 2025-09-07T07:38:46.5253957Z * [new branch] gh/zklaus/15/base -> origin/gh/zklaus/15/base 2025-09-07T07:38:46.5254390Z * [new branch] gh/zklaus/15/head -> origin/gh/zklaus/15/head 2025-09-07T07:38:46.5255048Z * [new branch] gh/zklaus/15/orig -> origin/gh/zklaus/15/orig 2025-09-07T07:38:46.5255810Z * [new branch] gh/zklaus/16/base -> origin/gh/zklaus/16/base 2025-09-07T07:38:46.5256225Z * [new branch] gh/zklaus/16/head -> origin/gh/zklaus/16/head 2025-09-07T07:38:46.5256788Z * [new branch] gh/zklaus/16/orig -> origin/gh/zklaus/16/orig 2025-09-07T07:38:46.5257536Z * [new branch] gh/zklaus/17/base -> origin/gh/zklaus/17/base 2025-09-07T07:38:46.5257959Z * [new branch] gh/zklaus/17/head -> origin/gh/zklaus/17/head 2025-09-07T07:38:46.5258874Z * [new branch] gh/zklaus/17/orig -> origin/gh/zklaus/17/orig 2025-09-07T07:38:46.5259504Z * [new branch] gh/zklaus/18/base -> origin/gh/zklaus/18/base 2025-09-07T07:38:46.5260108Z * [new branch] gh/zklaus/18/head -> origin/gh/zklaus/18/head 2025-09-07T07:38:46.5260527Z * [new branch] gh/zklaus/18/orig -> origin/gh/zklaus/18/orig 2025-09-07T07:38:46.5261320Z * [new branch] gh/zklaus/19/base -> origin/gh/zklaus/19/base 2025-09-07T07:38:46.5261766Z * [new branch] gh/zklaus/19/head -> origin/gh/zklaus/19/head 2025-09-07T07:38:46.5262402Z * [new branch] gh/zklaus/19/orig -> origin/gh/zklaus/19/orig 2025-09-07T07:38:46.5263062Z * [new branch] gh/zklaus/20/base -> origin/gh/zklaus/20/base 2025-09-07T07:38:46.5263847Z * [new branch] 
gh/zklaus/20/head -> origin/gh/zklaus/20/head 2025-09-07T07:38:46.5264277Z * [new branch] gh/zklaus/20/orig -> origin/gh/zklaus/20/orig 2025-09-07T07:38:46.5265070Z * [new branch] gh/zklaus/7/base -> origin/gh/zklaus/7/base 2025-09-07T07:38:46.5265490Z * [new branch] gh/zklaus/7/head -> origin/gh/zklaus/7/head 2025-09-07T07:38:46.5266058Z * [new branch] gh/zklaus/7/orig -> origin/gh/zklaus/7/orig 2025-09-07T07:38:46.5266790Z * [new branch] gh/zklaus/9/base -> origin/gh/zklaus/9/base 2025-09-07T07:38:46.5267191Z * [new branch] gh/zklaus/9/head -> origin/gh/zklaus/9/head 2025-09-07T07:38:46.5267765Z * [new branch] gh/zklaus/9/orig -> origin/gh/zklaus/9/orig 2025-09-07T07:38:46.5268715Z * [new branch] gh/zou3519/1175/base -> origin/gh/zou3519/1175/base 2025-09-07T07:38:46.5269427Z * [new branch] gh/zou3519/1175/head -> origin/gh/zou3519/1175/head 2025-09-07T07:38:46.5269853Z * [new branch] gh/zou3519/1175/orig -> origin/gh/zou3519/1175/orig 2025-09-07T07:38:46.5270640Z * [new branch] gh/zou3519/1177/base -> origin/gh/zou3519/1177/base 2025-09-07T07:38:46.5271112Z * [new branch] gh/zou3519/1177/head -> origin/gh/zou3519/1177/head 2025-09-07T07:38:46.5271745Z * [new branch] gh/zou3519/1177/orig -> origin/gh/zou3519/1177/orig 2025-09-07T07:38:46.5272474Z * [new branch] gh/zou3519/1191/base -> origin/gh/zou3519/1191/base 2025-09-07T07:38:46.5273053Z * [new branch] gh/zou3519/1191/head -> origin/gh/zou3519/1191/head 2025-09-07T07:38:46.5273500Z * [new branch] gh/zou3519/1191/orig -> origin/gh/zou3519/1191/orig 2025-09-07T07:38:46.5274327Z * [new branch] gh/zou3519/1192/base -> origin/gh/zou3519/1192/base 2025-09-07T07:38:46.5274795Z * [new branch] gh/zou3519/1192/head -> origin/gh/zou3519/1192/head 2025-09-07T07:38:46.5275359Z * [new branch] gh/zou3519/1192/orig -> origin/gh/zou3519/1192/orig 2025-09-07T07:38:46.5276267Z * [new branch] gh/zou3519/1193/base -> origin/gh/zou3519/1193/base 2025-09-07T07:38:46.5276837Z * [new branch] gh/zou3519/1193/head -> origin/gh/zou3519/1193/head 2025-09-07T07:38:46.5277245Z * [new branch] gh/zou3519/1193/orig -> origin/gh/zou3519/1193/orig 2025-09-07T07:38:46.5277996Z * [new branch] gh/zou3519/1194/base -> origin/gh/zou3519/1194/base 2025-09-07T07:38:46.5278688Z * [new branch] gh/zou3519/1194/head -> origin/gh/zou3519/1194/head 2025-09-07T07:38:46.5279287Z * [new branch] gh/zou3519/1194/orig -> origin/gh/zou3519/1194/orig 2025-09-07T07:38:46.5279986Z * [new branch] gh/zou3519/1195/base -> origin/gh/zou3519/1195/base 2025-09-07T07:38:46.5280622Z * [new branch] gh/zou3519/1195/head -> origin/gh/zou3519/1195/head 2025-09-07T07:38:46.5281097Z * [new branch] gh/zou3519/1195/orig -> origin/gh/zou3519/1195/orig 2025-09-07T07:38:46.5281827Z * [new branch] gh/zou3519/1196/base -> origin/gh/zou3519/1196/base 2025-09-07T07:38:46.5282269Z * [new branch] gh/zou3519/1196/head -> origin/gh/zou3519/1196/head 2025-09-07T07:38:46.5282886Z * [new branch] gh/zou3519/1196/orig -> origin/gh/zou3519/1196/orig 2025-09-07T07:38:46.5283776Z * [new branch] gh/zou3519/1197/base -> origin/gh/zou3519/1197/base 2025-09-07T07:38:46.5284403Z * [new branch] gh/zou3519/1197/head -> origin/gh/zou3519/1197/head 2025-09-07T07:38:46.5284864Z * [new branch] gh/zou3519/1197/orig -> origin/gh/zou3519/1197/orig 2025-09-07T07:38:46.5286210Z * [new branch] gh/zpcore/1/base -> origin/gh/zpcore/1/base 2025-09-07T07:38:46.5286631Z * [new branch] gh/zpcore/1/head -> origin/gh/zpcore/1/head 2025-09-07T07:38:46.5287768Z * [new branch] gh/zpcore/10/base -> origin/gh/zpcore/10/base 2025-09-07T07:38:46.5288339Z * 
[new branch] gh/zpcore/10/head -> origin/gh/zpcore/10/head 2025-09-07T07:38:46.5288755Z * [new branch] gh/zpcore/10/orig -> origin/gh/zpcore/10/orig 2025-09-07T07:38:46.5289584Z * [new branch] gh/zpcore/11/base -> origin/gh/zpcore/11/base 2025-09-07T07:38:46.5290139Z * [new branch] gh/zpcore/11/head -> origin/gh/zpcore/11/head 2025-09-07T07:38:46.5290577Z * [new branch] gh/zpcore/11/orig -> origin/gh/zpcore/11/orig 2025-09-07T07:38:46.5291528Z * [new branch] gh/zpcore/12/base -> origin/gh/zpcore/12/base 2025-09-07T07:38:46.5292233Z * [new branch] gh/zpcore/12/head -> origin/gh/zpcore/12/head 2025-09-07T07:38:46.5292824Z * [new branch] gh/zpcore/12/orig -> origin/gh/zpcore/12/orig 2025-09-07T07:38:46.5293560Z * [new branch] gh/zpcore/13/base -> origin/gh/zpcore/13/base 2025-09-07T07:38:46.5294035Z * [new branch] gh/zpcore/13/head -> origin/gh/zpcore/13/head 2025-09-07T07:38:46.5294600Z * [new branch] gh/zpcore/13/orig -> origin/gh/zpcore/13/orig 2025-09-07T07:38:46.5295341Z * [new branch] gh/zpcore/14/base -> origin/gh/zpcore/14/base 2025-09-07T07:38:46.5295758Z * [new branch] gh/zpcore/14/head -> origin/gh/zpcore/14/head 2025-09-07T07:38:46.5296570Z * [new branch] gh/zpcore/2/base -> origin/gh/zpcore/2/base 2025-09-07T07:38:46.5297009Z * [new branch] gh/zpcore/2/head -> origin/gh/zpcore/2/head 2025-09-07T07:38:46.5297824Z * [new branch] gh/zpcore/3/base -> origin/gh/zpcore/3/base 2025-09-07T07:38:46.5298220Z * [new branch] gh/zpcore/3/head -> origin/gh/zpcore/3/head 2025-09-07T07:38:46.5298982Z * [new branch] gh/zpcore/4/base -> origin/gh/zpcore/4/base 2025-09-07T07:38:46.5299374Z * [new branch] gh/zpcore/4/head -> origin/gh/zpcore/4/head 2025-09-07T07:38:46.5300108Z * [new branch] gh/zpcore/5/base -> origin/gh/zpcore/5/base 2025-09-07T07:38:46.5300494Z * [new branch] gh/zpcore/5/head -> origin/gh/zpcore/5/head 2025-09-07T07:38:46.5301170Z * [new branch] gh/zpcore/6/base -> origin/gh/zpcore/6/base 2025-09-07T07:38:46.5301641Z * [new branch] gh/zpcore/6/head -> origin/gh/zpcore/6/head 2025-09-07T07:38:46.5302306Z * [new branch] gh/zpcore/7/base -> origin/gh/zpcore/7/base 2025-09-07T07:38:46.5302666Z * [new branch] gh/zpcore/7/head -> origin/gh/zpcore/7/head 2025-09-07T07:38:46.5303821Z * [new branch] gh/zpcore/8/base -> origin/gh/zpcore/8/base 2025-09-07T07:38:46.5304202Z * [new branch] gh/zpcore/8/head -> origin/gh/zpcore/8/head 2025-09-07T07:38:46.5305007Z * [new branch] google-main -> origin/google-main 2025-09-07T07:38:46.5305879Z * [new branch] guangyey/external_stream -> origin/guangyey/external_stream 2025-09-07T07:38:46.5306254Z * [new branch] guangyey/host_alloc -> origin/guangyey/host_alloc 2025-09-07T07:38:46.5306870Z * [new branch] guangyey/reimport -> origin/guangyey/reimport 2025-09-07T07:38:46.5307260Z * [new branch] guangyey/test_2025 -> origin/guangyey/test_2025 2025-09-07T07:38:46.5308166Z * [new branch] guilhermeleobas/cherry-pick-55d87d9dfd9 -> origin/guilhermeleobas/cherry-pick-55d87d9dfd9 2025-09-07T07:38:46.5308809Z * [new branch] haozhe/bf16-dynamic-shape -> origin/haozhe/bf16-dynamic-shape 2025-09-07T07:38:46.5309235Z * [new branch] hc_baseline -> origin/hc_baseline 2025-09-07T07:38:46.5309877Z * [new branch] hf_update -> origin/hf_update 2025-09-07T07:38:46.5310292Z * [new branch] hhh_decomp_mul -> origin/hhh_decomp_mul 2025-09-07T07:38:46.5310894Z * [new branch] hhh_rand -> origin/hhh_rand 2025-09-07T07:38:46.5311617Z * [new branch] hoy/mmsplitk -> origin/hoy/mmsplitk 2025-09-07T07:38:46.5312010Z * [new branch] hoy/triton-PR3973 -> origin/hoy/triton-PR3973 
2025-09-07T07:38:46.5312706Z * [new branch] hoy/triton-coalescing-baseline -> origin/hoy/triton-coalescing-baseline 2025-09-07T07:38:46.5313080Z * [new branch] hoy/triton-coalescing-new -> origin/hoy/triton-coalescing-new 2025-09-07T07:38:46.5313579Z * [new branch] hoy/triton-coalescing-vec -> origin/hoy/triton-coalescing-vec 2025-09-07T07:38:46.5314168Z * [new branch] inductordecompfix -> origin/inductordecompfix 2025-09-07T07:38:46.5314631Z * [new branch] inline -> origin/inline 2025-09-07T07:38:46.5315341Z * [new branch] inlining -> origin/inlining 2025-09-07T07:38:46.5315793Z * [new branch] inlining-ezyang -> origin/inlining-ezyang 2025-09-07T07:38:46.5316421Z * [new branch] install-torchao-0.13.0 -> origin/install-torchao-0.13.0 2025-09-07T07:38:46.5316832Z * [new branch] int8_sdpa -> origin/int8_sdpa 2025-09-07T07:38:46.5317518Z * [new branch] invoke-subgraph -> origin/invoke-subgraph 2025-09-07T07:38:46.5317922Z * [new branch] issue#58739 -> origin/issue#58739 2025-09-07T07:38:46.5318956Z * [new branch] jcaip/test-cusparselt-version-0.6.2 -> origin/jcaip/test-cusparselt-version-0.6.2 2025-09-07T07:38:46.5319350Z * [new branch] jcaip/update-cusparselt-0.6.2 -> origin/jcaip/update-cusparselt-0.6.2 2025-09-07T07:38:46.5320172Z * [new branch] jeanschmidt/disable_rocm_build_tests -> origin/jeanschmidt/disable_rocm_build_tests 2025-09-07T07:38:46.5320655Z * [new branch] jithunnair-amd-patch-1 -> origin/jithunnair-amd-patch-1 2025-09-07T07:38:46.5321392Z * [new branch] jithunnair-amd-patch-2 -> origin/jithunnair-amd-patch-2 2025-09-07T07:38:46.5322078Z * [new branch] justinchu/attention-tests -> origin/justinchu/attention-tests 2025-09-07T07:38:46.5322504Z * [new branch] justinchu/native-qdq -> origin/justinchu/native-qdq 2025-09-07T07:38:46.5323106Z * [new branch] justinchu/ort-122 -> origin/justinchu/ort-122 2025-09-07T07:38:46.5323850Z * [new branch] justinchuby/dynamo-true -> origin/justinchuby/dynamo-true 2025-09-07T07:38:46.5324744Z * [new branch] kainan666/xlf_debug -> origin/kainan666/xlf_debug 2025-09-07T07:38:46.5325169Z * [new branch] kainan_test -> origin/kainan_test 2025-09-07T07:38:46.5325735Z * [new branch] learnablebias -> origin/learnablebias 2025-09-07T07:38:46.5326462Z * [new branch] leslie/test_group_gemm_epilogues -> origin/leslie/test_group_gemm_epilogues 2025-09-07T07:38:46.5327190Z * [new branch] lessw2020/fix_cutlass_cache_error -> origin/lessw2020/fix_cutlass_cache_error 2025-09-07T07:38:46.5327945Z * [new branch] liaoxuan/shm_all_reduce -> origin/liaoxuan/shm_all_reduce 2025-09-07T07:38:46.5328388Z * [new branch] liaoxuan/test_fa_disable_softmax -> origin/liaoxuan/test_fa_disable_softmax 2025-09-07T07:38:46.5328825Z * [new branch] liaoxuan/test_int8_sdpa -> origin/liaoxuan/test_int8_sdpa 2025-09-07T07:38:46.5329412Z * [new branch] lintbuilddocker -> origin/lintbuilddocker 2025-09-07T07:38:46.5329780Z * [new branch] llama4-stable -> origin/llama4-stable 2025-09-07T07:38:46.5330459Z * [new branch] logdetfix -> origin/logdetfix 2025-09-07T07:38:46.5331403Z * [new branch] lts/release/1.8 -> origin/lts/release/1.8 2025-09-07T07:38:46.5332139Z * [new branch] lucaskabela/#94773 -> origin/lucaskabela/#94773 2025-09-07T07:38:46.5332548Z * [new branch] lucaskabela/flop_counter -> origin/lucaskabela/flop_counter 2025-09-07T07:38:46.5333245Z * [new branch] lucaskabela/func_under_decomp -> origin/lucaskabela/func_under_decomp 2025-09-07T07:38:46.5333676Z * [new branch] lucaskabela/functional_in_dynamo -> origin/lucaskabela/functional_in_dynamo 2025-09-07T07:38:46.5334242Z * 
[new branch] lucaskabela/install_params_as_graph_attr -> origin/lucaskabela/install_params_as_graph_attr 2025-09-07T07:38:46.5334672Z * [new branch] lucaskabela/issue_120648 -> origin/lucaskabela/issue_120648 2025-09-07T07:38:46.5335499Z * [new branch] lucaskabela/misc_typing_dynamo -> origin/lucaskabela/misc_typing_dynamo 2025-09-07T07:38:46.5336229Z * [new branch] lucaskabela/parameters_as_graph_attr -> origin/lucaskabela/parameters_as_graph_attr 2025-09-07T07:38:46.5336730Z * [new branch] lucaskabela/remove_aot_dispatcher_metadata -> origin/lucaskabela/remove_aot_dispatcher_metadata 2025-09-07T07:38:46.5337313Z * [new branch] lucaskabela/rnn_decomp -> origin/lucaskabela/rnn_decomp 2025-09-07T07:38:46.5337738Z * [new branch] lucaskabela/typing_backends -> origin/lucaskabela/typing_backends 2025-09-07T07:38:46.5338376Z * [new branch] lucaskabela/typing_symbolic_convert -> origin/lucaskabela/typing_symbolic_convert 2025-09-07T07:38:46.5338798Z * [new branch] lucaskabela/typing_utils_improvements -> origin/lucaskabela/typing_utils_improvements 2025-09-07T07:38:46.5339358Z * [new branch] main -> origin/main 2025-09-07T07:38:46.5340047Z * [new branch] main-enable-b200-distributed-tests -> origin/main-enable-b200-distributed-tests 2025-09-07T07:38:46.5340455Z * [new branch] malfet-patch-1 -> origin/malfet-patch-1 2025-09-07T07:38:46.5341155Z * [new branch] malfet-patch-12 -> origin/malfet-patch-12 2025-09-07T07:38:46.5341880Z * [new branch] malfet-patch-14 -> origin/malfet-patch-14 2025-09-07T07:38:46.5342822Z * [new branch] malfet-patch-6 -> origin/malfet-patch-6 2025-09-07T07:38:46.5343394Z * [new branch] malfet-patch-8 -> origin/malfet-patch-8 2025-09-07T07:38:46.5344321Z * [new branch] malfet/be-move-more-settings-to-checkout-pytorch -> origin/malfet/be-move-more-settings-to-checkout-pytorch 2025-09-07T07:38:46.5344765Z * [new branch] malfet/delete-upsteam-cuda -> origin/malfet/delete-upsteam-cuda 2025-09-07T07:38:46.5345219Z * [new branch] malfet/mps-implement-col2im -> origin/malfet/mps-implement-col2im 2025-09-07T07:38:46.5346011Z * [new branch] manuel/test-ops-common-allow-mps -> origin/manuel/test-ops-common-allow-mps 2025-09-07T07:38:46.5346448Z * [new branch] metascroy-patch-1 -> origin/metascroy-patch-1 2025-09-07T07:38:46.5347259Z * [new branch] mlazos/S429861-debug -> origin/mlazos/S429861-debug 2025-09-07T07:38:46.5347642Z * [new branch] mlazos/aa -> origin/mlazos/aa 2025-09-07T07:38:46.5348253Z * [new branch] mlazos/arg-renames -> origin/mlazos/arg-renames 2025-09-07T07:38:46.5348700Z * [new branch] mlazos/backup-test-branch -> origin/mlazos/backup-test-branch 2025-09-07T07:38:46.5349172Z * [new branch] mlazos/bad-cudagraphs -> origin/mlazos/bad-cudagraphs 2025-09-07T07:38:46.5349740Z * [new branch] mlazos/baseline -> origin/mlazos/baseline 2025-09-07T07:38:46.5350162Z * [new branch] mlazos/baseline-graph-breaks -> origin/mlazos/baseline-graph-breaks 2025-09-07T07:38:46.5350901Z * [new branch] mlazos/beta-tensor -> origin/mlazos/beta-tensor 2025-09-07T07:38:46.5351554Z * [new branch] mlazos/better-msg -> origin/mlazos/better-msg 2025-09-07T07:38:46.5352215Z * [new branch] mlazos/buffers -> origin/mlazos/buffers 2025-09-07T07:38:46.5352668Z * [new branch] mlazos/buffers2 -> origin/mlazos/buffers2 2025-09-07T07:38:46.5353278Z * [new branch] mlazos/buffers3 -> origin/mlazos/buffers3 2025-09-07T07:38:46.5354196Z * [new branch] mlazos/ck2 -> origin/mlazos/ck2 2025-09-07T07:38:46.5354791Z * [new branch] mlazos/combokernels -> origin/mlazos/combokernels 2025-09-07T07:38:46.5355251Z * 
[new branch] mlazos/ctx-cleanup -> origin/mlazos/ctx-cleanup 2025-09-07T07:38:46.5355691Z * [new branch] mlazos/cuda-cmd-log -> origin/mlazos/cuda-cmd-log 2025-09-07T07:38:46.5356307Z * [new branch] mlazos/cudagraph-tests -> origin/mlazos/cudagraph-tests 2025-09-07T07:38:46.5356733Z * [new branch] mlazos/cudagraphs-measurement -> origin/mlazos/cudagraphs-measurement 2025-09-07T07:38:46.5357419Z * [new branch] mlazos/cutlass-test -> origin/mlazos/cutlass-test 2025-09-07T07:38:46.5357906Z * [new branch] mlazos/cutlass-topo-bug -> origin/mlazos/cutlass-topo-bug 2025-09-07T07:38:46.5358373Z * [new branch] mlazos/data-gather -> origin/mlazos/data-gather 2025-09-07T07:38:46.5358960Z * [new branch] mlazos/data-ptrs2 -> origin/mlazos/data-ptrs2 2025-09-07T07:38:46.5359547Z * [new branch] mlazos/data-ptrs3 -> origin/mlazos/data-ptrs3 2025-09-07T07:38:46.5360012Z * [new branch] mlazos/dataclass-proxy -> origin/mlazos/dataclass-proxy 2025-09-07T07:38:46.5360582Z * [new branch] mlazos/dc-attrs -> origin/mlazos/dc-attrs 2025-09-07T07:38:46.5361000Z * [new branch] mlazos/dc-helion -> origin/mlazos/dc-helion 2025-09-07T07:38:46.5361605Z * [new branch] mlazos/dict-fix -> origin/mlazos/dict-fix 2025-09-07T07:38:46.5362213Z * [new branch] mlazos/disable-closures -> origin/mlazos/disable-closures 2025-09-07T07:38:46.5362931Z * [new branch] mlazos/disable-tf -> origin/mlazos/disable-tf 2025-09-07T07:38:46.5363673Z * [new branch] mlazos/dupe-fix -> origin/mlazos/dupe-fix 2025-09-07T07:38:46.5364255Z * [new branch] mlazos/dyn-batch -> origin/mlazos/dyn-batch 2025-09-07T07:38:46.5364757Z * [new branch] mlazos/evt -> origin/mlazos/evt 2025-09-07T07:38:46.5365337Z * [new branch] mlazos/exp_disable -> origin/mlazos/exp_disable 2025-09-07T07:38:46.5365799Z * [new branch] mlazos/extract-examples -> origin/mlazos/extract-examples 2025-09-07T07:38:46.5366378Z * [new branch] mlazos/foreach-op -> origin/mlazos/foreach-op 2025-09-07T07:38:46.5366789Z * [new branch] mlazos/fp8 -> origin/mlazos/fp8 2025-09-07T07:38:46.5367358Z * [new branch] mlazos/fp8-bias -> origin/mlazos/fp8-bias 2025-09-07T07:38:46.5368022Z * [new branch] mlazos/fp8-bias-fusion -> origin/mlazos/fp8-bias-fusion 2025-09-07T07:38:46.5368340Z * [new branch] mlazos/fp8-fixes -> origin/mlazos/fp8-fixes 2025-09-07T07:38:46.5368949Z * [new branch] mlazos/freezing -> origin/mlazos/freezing 2025-09-07T07:38:46.5369372Z * [new branch] mlazos/h-comp -> origin/mlazos/h-comp 2025-09-07T07:38:46.5369948Z * [new branch] mlazos/h-comp2 -> origin/mlazos/h-comp2 2025-09-07T07:38:46.5370551Z * [new branch] mlazos/hash-hop -> origin/mlazos/hash-hop 2025-09-07T07:38:46.5371078Z * [new branch] mlazos/hc -> origin/mlazos/hc 2025-09-07T07:38:46.5371620Z * [new branch] mlazos/hc-cycles -> origin/mlazos/hc-cycles 2025-09-07T07:38:46.5372148Z * [new branch] mlazos/hc-fixes -> origin/mlazos/hc-fixes 2025-09-07T07:38:46.5372609Z * [new branch] mlazos/hc-fixes3 -> origin/mlazos/hc-fixes3 2025-09-07T07:38:46.5373139Z * [new branch] mlazos/hc-fixes4 -> origin/mlazos/hc-fixes4 2025-09-07T07:38:46.5373616Z * [new branch] mlazos/hc-hf -> origin/mlazos/hc-hf 2025-09-07T07:38:46.5374135Z * [new branch] mlazos/hc-mut -> origin/mlazos/hc-mut 2025-09-07T07:38:46.5374676Z * [new branch] mlazos/hc10 -> origin/mlazos/hc10 2025-09-07T07:38:46.5375294Z * [new branch] mlazos/hc11 -> origin/mlazos/hc11 2025-09-07T07:38:46.5375808Z * [new branch] mlazos/hc12 -> origin/mlazos/hc12 2025-09-07T07:38:46.5376295Z * [new branch] mlazos/hc13 -> origin/mlazos/hc13 2025-09-07T07:38:46.5376794Z * [new branch] 
mlazos/hc14 -> origin/mlazos/hc14 2025-09-07T07:38:46.5377248Z * [new branch] mlazos/hc15 -> origin/mlazos/hc15 2025-09-07T07:38:46.5377797Z * [new branch] mlazos/hc2 -> origin/mlazos/hc2 2025-09-07T07:38:46.5378303Z * [new branch] mlazos/hc4 -> origin/mlazos/hc4 2025-09-07T07:38:46.5378891Z * [new branch] mlazos/hc5 -> origin/mlazos/hc5 2025-09-07T07:38:46.5379403Z * [new branch] mlazos/hc6 -> origin/mlazos/hc6 2025-09-07T07:38:46.5379896Z * [new branch] mlazos/hc7 -> origin/mlazos/hc7 2025-09-07T07:38:46.5380340Z * [new branch] mlazos/hc8 -> origin/mlazos/hc8 2025-09-07T07:38:46.5380861Z * [new branch] mlazos/hc9 -> origin/mlazos/hc9 2025-09-07T07:38:46.5381449Z * [new branch] mlazos/hc_baseline2 -> origin/mlazos/hc_baseline2 2025-09-07T07:38:46.5382054Z * [new branch] mlazos/init-per-param -> origin/mlazos/init-per-param 2025-09-07T07:38:46.5382435Z * [new branch] mlazos/init_per_param -> origin/mlazos/init_per_param 2025-09-07T07:38:46.5383136Z * [new branch] mlazos/less-guards -> origin/mlazos/less-guards 2025-09-07T07:38:46.5383611Z * [new branch] mlazos/lr-composibility -> origin/mlazos/lr-composibility 2025-09-07T07:38:46.5384096Z * [new branch] mlazos/main -> origin/mlazos/main 2025-09-07T07:38:46.5384587Z * [new branch] mlazos/main-test-enablement -> origin/mlazos/main-test-enablement 2025-09-07T07:38:46.5385155Z * [new branch] mlazos/main2 -> origin/mlazos/main2 2025-09-07T07:38:46.5385744Z * [new branch] mlazos/mark-static-update -> origin/mlazos/mark-static-update 2025-09-07T07:38:46.5386151Z * [new branch] mlazos/mcg -> origin/mlazos/mcg 2025-09-07T07:38:46.5386710Z * [new branch] mlazos/mcg2 -> origin/mlazos/mcg2 2025-09-07T07:38:46.5387333Z * [new branch] mlazos/meta-guards -> origin/mlazos/meta-guards 2025-09-07T07:38:46.5387991Z * [new branch] mlazos/mlazos/ck2 -> origin/mlazos/mlazos/ck2 2025-09-07T07:38:46.5388480Z * [new branch] mlazos/mlazos/foreach-map-adam -> origin/mlazos/mlazos/foreach-map-adam 2025-09-07T07:38:46.5389084Z * [new branch] mlazos/mlazos/tf-mode-backup -> origin/mlazos/mlazos/tf-mode-backup 2025-09-07T07:38:46.5389507Z * [new branch] mlazos/mod-fix -> origin/mlazos/mod-fix 2025-09-07T07:38:46.5390118Z * [new branch] mlazos/mode-fix -> origin/mlazos/mode-fix 2025-09-07T07:38:46.5390546Z * [new branch] mlazos/more-tests -> origin/mlazos/more-tests 2025-09-07T07:38:46.5391262Z * [new branch] mlazos/no-cpp -> origin/mlazos/no-cpp 2025-09-07T07:38:46.5391882Z * [new branch] mlazos/no-init-group-handling -> origin/mlazos/no-init-group-handling 2025-09-07T07:38:46.5392275Z * [new branch] mlazos/offsets -> origin/mlazos/offsets 2025-09-07T07:38:46.5392792Z * [new branch] mlazos/opt-bench-exp2 -> origin/mlazos/opt-bench-exp2 2025-09-07T07:38:46.5393261Z * [new branch] mlazos/opt-incr -> origin/mlazos/opt-incr 2025-09-07T07:38:46.5393845Z * [new branch] mlazos/proxy-ctors -> origin/mlazos/proxy-ctors 2025-09-07T07:38:46.5394253Z * [new branch] mlazos/quant-fix -> origin/mlazos/quant-fix 2025-09-07T07:38:46.5394874Z * [new branch] mlazos/resnet-fix -> origin/mlazos/resnet-fix 2025-09-07T07:38:46.5395459Z * [new branch] mlazos/revert-inline -> origin/mlazos/revert-inline 2025-09-07T07:38:46.5395927Z * [new branch] mlazos/rm-buf-names -> origin/mlazos/rm-buf-names 2025-09-07T07:38:46.5396337Z * [new branch] mlazos/rm-code -> origin/mlazos/rm-code 2025-09-07T07:38:46.5396877Z * [new branch] mlazos/rm-spam -> origin/mlazos/rm-spam 2025-09-07T07:38:46.5397436Z * [new branch] mlazos/rtp -> origin/mlazos/rtp 2025-09-07T07:38:46.5397914Z * [new branch] 
mlazos/static-idx-dbg -> origin/mlazos/static-idx-dbg 2025-09-07T07:38:46.5398819Z * [new branch] mlazos/static-inputs-log -> origin/mlazos/static-inputs-log 2025-09-07T07:38:46.5399774Z * [new branch] mlazos/sub-param-fix -> origin/mlazos/sub-param-fix 2025-09-07T07:38:46.5400186Z * [new branch] mlazos/td-fix2 -> origin/mlazos/td-fix2 2025-09-07T07:38:46.5400894Z * [new branch] mlazos/tensor-hasattr2 -> origin/mlazos/tensor-hasattr2 2025-09-07T07:38:46.5401192Z * [new branch] mlazos/test -> origin/mlazos/test 2025-09-07T07:38:46.5401773Z * [new branch] mlazos/tf-mode -> origin/mlazos/tf-mode 2025-09-07T07:38:46.5402365Z * [new branch] mlazos/tf-mode-backup2 -> origin/mlazos/tf-mode-backup2 2025-09-07T07:38:46.5402802Z * [new branch] mlazos/tf-mode-reland -> origin/mlazos/tf-mode-reland 2025-09-07T07:38:46.5403543Z * [new branch] mlazos/tf-mode-reland2 -> origin/mlazos/tf-mode-reland2 2025-09-07T07:38:46.5404059Z * [new branch] mlazos/tf-mode-reland3 -> origin/mlazos/tf-mode-reland3 2025-09-07T07:38:46.5404627Z * [new branch] mlazos/topo-fix -> origin/mlazos/topo-fix 2025-09-07T07:38:46.5405058Z * [new branch] mlazos/triton-no-epi -> origin/mlazos/triton-no-epi 2025-09-07T07:38:46.5405635Z * [new branch] mlazos/tune-proto -> origin/mlazos/tune-proto 2025-09-07T07:38:46.5406040Z * [new branch] mlazos/tuple-fixes -> origin/mlazos/tuple-fixes 2025-09-07T07:38:46.5406479Z * [new branch] mlazos/tuple-fixes2 -> origin/mlazos/tuple-fixes2 2025-09-07T07:38:46.5407067Z * [new branch] mlazos/tuple-handling -> origin/mlazos/tuple-handling 2025-09-07T07:38:46.5407699Z * [new branch] mlazos/user-streams -> origin/mlazos/user-streams 2025-09-07T07:38:46.5408168Z * [new branch] mlazos/vary-beta -> origin/mlazos/vary-beta 2025-09-07T07:38:46.5408701Z * [new branch] mlazos/vary-beta2 -> origin/mlazos/vary-beta2 2025-09-07T07:38:46.5409241Z * [new branch] mlazos/weird-perf1 -> origin/mlazos/weird-perf1 2025-09-07T07:38:46.5409925Z * [new branch] mm_out_dtype_compile -> origin/mm_out_dtype_compile 2025-09-07T07:38:46.5410548Z * [new branch] modify-setupvllm -> origin/modify-setupvllm 2025-09-07T07:38:46.5411007Z * [new branch] module-shim -> origin/module-shim 2025-09-07T07:38:46.5411646Z * [new branch] move-theme-out-docker -> origin/move-theme-out-docker 2025-09-07T07:38:46.5412485Z * [new branch] msaroufim/be1 -> origin/msaroufim/be1 2025-09-07T07:38:46.5413114Z * [new branch] msaroufim/cn_path -> origin/msaroufim/cn_path 2025-09-07T07:38:46.5413620Z * [new branch] msaroufim/dtensorfusedadam -> origin/msaroufim/dtensorfusedadam 2025-09-07T07:38:46.5414205Z * [new branch] msaroufim/reduce -> origin/msaroufim/reduce 2025-09-07T07:38:46.5414952Z * [new branch] mtia/basic-cmake -> origin/mtia/basic-cmake 2025-09-07T07:38:46.5415511Z * [new branch] muon_dev -> origin/muon_dev 2025-09-07T07:38:46.5416081Z * [new branch] muon_dev_1 -> origin/muon_dev_1 2025-09-07T07:38:46.5416758Z * [new branch] nativert_num_outputs -> origin/nativert_num_outputs 2025-09-07T07:38:46.5417393Z * [new branch] nativert_numoutputs -> origin/nativert_numoutputs 2025-09-07T07:38:46.5417854Z * [new branch] new-modifiy-setupvllm -> origin/new-modifiy-setupvllm 2025-09-07T07:38:46.5418480Z * [new branch] new-setupvllm -> origin/new-setupvllm 2025-09-07T07:38:46.5419050Z * [new branch] new_zeros_dtype -> origin/new_zeros_dtype 2025-09-07T07:38:46.5419627Z * [new branch] newtest-base -> origin/newtest-base 2025-09-07T07:38:46.5420330Z * [new branch] ngimel/cat_perf1 -> origin/ngimel/cat_perf1 2025-09-07T07:38:46.5420776Z * [new branch] 
ngimel/einsum_fix -> origin/ngimel/einsum_fix 2025-09-07T07:38:46.5421365Z * [new branch] ngimel/error_index_list -> origin/ngimel/error_index_list 2025-09-07T07:38:46.5421752Z * [new branch] ngimel/fabric_check -> origin/ngimel/fabric_check 2025-09-07T07:38:46.5422264Z * [new branch] ngimel/fabric_fix -> origin/ngimel/fabric_fix 2025-09-07T07:38:46.5422732Z * [new branch] ngimel/fix_driver_init_error -> origin/ngimel/fix_driver_init_error 2025-09-07T07:38:46.5423484Z * [new branch] ngimel/fix_nccl_segment_seg -> origin/ngimel/fix_nccl_segment_seg 2025-09-07T07:38:46.5424076Z * [new branch] ngimel/gg_new -> origin/ngimel/gg_new 2025-09-07T07:38:46.5424805Z * [new branch] ngimel/modeguard -> origin/ngimel/modeguard 2025-09-07T07:38:46.5425495Z * [new branch] ngimel/multicast_fix -> origin/ngimel/multicast_fix 2025-09-07T07:38:46.5425926Z * [new branch] ngimel/rocm_handle_type -> origin/ngimel/rocm_handle_type 2025-09-07T07:38:46.5426514Z * [new branch] ngimel/symm_handle_fabric -> origin/ngimel/symm_handle_fabric 2025-09-07T07:38:46.5426958Z * [new branch] ngimel/unbind_multimem -> origin/ngimel/unbind_multimem 2025-09-07T07:38:46.5427577Z * [new branch] nightly -> origin/nightly 2025-09-07T07:38:46.5428201Z * [new branch] nmacchioni-patch-10 -> origin/nmacchioni-patch-10 2025-09-07T07:38:46.5428802Z * [new branch] nmacchioni-patch-7 -> origin/nmacchioni-patch-7 2025-09-07T07:38:46.5429415Z * [new branch] nmacchioni-patch-8 -> origin/nmacchioni-patch-8 2025-09-07T07:38:46.5430008Z * [new branch] nmacchioni-patch-9 -> origin/nmacchioni-patch-9 2025-09-07T07:38:46.5430800Z * [new branch] nullplay/fuse_matmul -> origin/nullplay/fuse_matmul 2025-09-07T07:38:46.5431279Z * [new branch] nullplay_fuse_matmul -> origin/nullplay_fuse_matmul 2025-09-07T07:38:46.5431868Z * [new branch] one-off -> origin/one-off 2025-09-07T07:38:46.5432862Z * [new branch] orig/release/1.10 -> origin/orig/release/1.10 2025-09-07T07:38:46.5433427Z * [new branch] orig/release/1.11 -> origin/orig/release/1.11 2025-09-07T07:38:46.5434128Z * [new branch] orig/release/1.12 -> origin/orig/release/1.12 2025-09-07T07:38:46.5434802Z * [new branch] orig/release/1.13 -> origin/orig/release/1.13 2025-09-07T07:38:46.5435449Z * [new branch] orig/release/1.6 -> origin/orig/release/1.6 2025-09-07T07:38:46.5436162Z * [new branch] orig/release/1.7 -> origin/orig/release/1.7 2025-09-07T07:38:46.5436659Z * [new branch] orig/release/1.8 -> origin/orig/release/1.8 2025-09-07T07:38:46.5437292Z * [new branch] orig/release/1.9 -> origin/orig/release/1.9 2025-09-07T07:38:46.5437886Z * [new branch] orig/release/2.0 -> origin/orig/release/2.0 2025-09-07T07:38:46.5438302Z * [new branch] orig/release/2.1 -> origin/orig/release/2.1 2025-09-07T07:38:46.5439004Z * [new branch] orig/release/2.2 -> origin/orig/release/2.2 2025-09-07T07:38:46.5439441Z * [new branch] orig/release/2.3 -> origin/orig/release/2.3 2025-09-07T07:38:46.5440060Z * [new branch] orig/release/2.4 -> origin/orig/release/2.4 2025-09-07T07:38:46.5440482Z * [new branch] orig/release/2.5 -> origin/orig/release/2.5 2025-09-07T07:38:46.5441140Z * [new branch] orig/release/2.6 -> origin/orig/release/2.6 2025-09-07T07:38:46.5441782Z * [new branch] orig/release/2.7 -> origin/orig/release/2.7 2025-09-07T07:38:46.5442522Z * [new branch] orig/release/2.8 -> origin/orig/release/2.8 2025-09-07T07:38:46.5443390Z * [new branch] oulgen/fx_graph -> origin/oulgen/fx_graph 2025-09-07T07:38:46.5443959Z * [new branch] padded-tensor -> origin/padded-tensor 2025-09-07T07:38:46.5444515Z * [new branch] pca2 -> 
origin/pca2 2025-09-07T07:38:46.5445135Z * [new branch] pianpwk-patch-1 -> origin/pianpwk-patch-1 2025-09-07T07:38:46.5445946Z * [new branch] pianpwk/backed_size_oblivious_export -> origin/pianpwk/backed_size_oblivious_export 2025-09-07T07:38:46.5446386Z * [new branch] pianpwk/invalidate_fake_memo -> origin/pianpwk/invalidate_fake_memo 2025-09-07T07:38:46.5446828Z * [new branch] pianpwk/max_1_strides -> origin/pianpwk/max_1_strides 2025-09-07T07:38:46.5447442Z * [new branch] pianpwk/maybe_guard_rel -> origin/pianpwk/maybe_guard_rel 2025-09-07T07:38:46.5447837Z * [new branch] pianpwk/nonzero_memo -> origin/pianpwk/nonzero_memo 2025-09-07T07:38:46.5448587Z * [new branch] pianpwk/oblivious_reshape_view_better -> origin/pianpwk/oblivious_reshape_view_better 2025-09-07T07:38:46.5449318Z * [new branch] pianpwk/oblivious_slice_forward -> origin/pianpwk/oblivious_slice_forward 2025-09-07T07:38:46.5449765Z * [new branch] pianpwk/oblivious_where -> origin/pianpwk/oblivious_where 2025-09-07T07:38:46.5450364Z * [new branch] pianpwk/param_static_pgo -> origin/pianpwk/param_static_pgo 2025-09-07T07:38:46.5450781Z * [new branch] pianpwk/pre_forward_hook -> origin/pianpwk/pre_forward_hook 2025-09-07T07:38:46.5451477Z * [new branch] pianpwk/remove_guard_fail_break -> origin/pianpwk/remove_guard_fail_break 2025-09-07T07:38:46.5452084Z * [new branch] pianpwk/slice_fresh_symbols -> origin/pianpwk/slice_fresh_symbols 2025-09-07T07:38:46.5452511Z * [new branch] pianpwk/sym_tokens_draft -> origin/pianpwk/sym_tokens_draft 2025-09-07T07:38:46.5453280Z * [new branch] pianpwk/test_pointwise_guard_or_false -> origin/pianpwk/test_pointwise_guard_or_false 2025-09-07T07:38:46.5454073Z * [new branch] pianpwk/test_slice_fake_impl -> origin/pianpwk/test_slice_fake_impl 2025-09-07T07:38:46.5454493Z * [new branch] pianpwk/totally_draft_sym_wrap -> origin/pianpwk/totally_draft_sym_wrap 2025-09-07T07:38:46.5454987Z * [new branch] pianpwk/unbacked_channels_last -> origin/pianpwk/unbacked_channels_last 2025-09-07T07:38:46.5455604Z * [new branch] pianpwk/unbacked_safe_conv1d -> origin/pianpwk/unbacked_safe_conv1d 2025-09-07T07:38:46.5456053Z * [new branch] pianpwk/unbacked_sdpa_flash -> origin/pianpwk/unbacked_sdpa_flash 2025-09-07T07:38:46.5456734Z * [new branch] pianpwk/unbacked_should_swap -> origin/pianpwk/unbacked_should_swap 2025-09-07T07:38:46.5457158Z * [new branch] pianpwk/unbacked_should_swap_2 -> origin/pianpwk/unbacked_should_swap_2 2025-09-07T07:38:46.5457669Z * [new branch] pianpwk/unbacked_slice_binding -> origin/pianpwk/unbacked_slice_binding 2025-09-07T07:38:46.5458148Z * [new branch] pianpwk/unbacked_slice_forward -> origin/pianpwk/unbacked_slice_forward 2025-09-07T07:38:46.5458618Z * [new branch] pianpwk/user_symints -> origin/pianpwk/user_symints 2025-09-07T07:38:46.5459087Z * [new branch] pianpwk/wan21_reshape -> origin/pianpwk/wan21_reshape 2025-09-07T07:38:46.5459645Z * [new branch] pianpwk/whitelist_optimizer -> origin/pianpwk/whitelist_optimizer 2025-09-07T07:38:46.5460405Z * [new branch] pin-torchao -> origin/pin-torchao 2025-09-07T07:38:46.5461169Z * [new branch] piz/fall_back_missing_0716 -> origin/piz/fall_back_missing_0716 2025-09-07T07:38:46.5461571Z * [new branch] piz/improve_scatter_0808 -> origin/piz/improve_scatter_0808 2025-09-07T07:38:46.5462174Z * [new branch] pool-separate -> origin/pool-separate 2025-09-07T07:38:46.5462743Z * [new branch] pr-156087 -> origin/pr-156087 2025-09-07T07:38:46.5463471Z * [new branch] pr/131860 -> origin/pr/131860 2025-09-07T07:38:46.5464057Z * [new branch] 
predispatch_to -> origin/predispatch_to 2025-09-07T07:38:46.5464707Z * [new branch] pt-opt-cuda3 -> origin/pt-opt-cuda3 2025-09-07T07:38:46.5465285Z * [new branch] pyobjectslot -> origin/pyobjectslot 2025-09-07T07:38:46.5466064Z * [new branch] python_compiled_autograd -> origin/python_compiled_autograd 2025-09-07T07:38:46.5467123Z * [new branch] qchip/export-D54134695 -> origin/qchip/export-D54134695 2025-09-07T07:38:46.5467613Z * [new branch] quint-bits -> origin/quint-bits 2025-09-07T07:38:46.5468525Z * [new branch] release/1.10 -> origin/release/1.10 2025-09-07T07:38:46.5469092Z * [new branch] release/1.11 -> origin/release/1.11 2025-09-07T07:38:46.5469802Z * [new branch] release/1.12 -> origin/release/1.12 2025-09-07T07:38:46.5470421Z * [new branch] release/1.13 -> origin/release/1.13 2025-09-07T07:38:46.5470821Z * [new branch] release/1.4 -> origin/release/1.4 2025-09-07T07:38:46.5471251Z * [new branch] release/1.4.1 -> origin/release/1.4.1 2025-09-07T07:38:46.5471873Z * [new branch] release/1.5 -> origin/release/1.5 2025-09-07T07:38:46.5472475Z * [new branch] release/1.6 -> origin/release/1.6 2025-09-07T07:38:46.5473036Z * [new branch] release/1.7 -> origin/release/1.7 2025-09-07T07:38:46.5473733Z * [new branch] release/1.8 -> origin/release/1.8 2025-09-07T07:38:46.5474173Z * [new branch] release/1.9 -> origin/release/1.9 2025-09-07T07:38:46.5475167Z * [new branch] release/2.0 -> origin/release/2.0 2025-09-07T07:38:46.5475781Z * [new branch] release/2.1 -> origin/release/2.1 2025-09-07T07:38:46.5476349Z * [new branch] release/2.2 -> origin/release/2.2 2025-09-07T07:38:46.5477103Z * [new branch] release/2.3 -> origin/release/2.3 2025-09-07T07:38:46.5477843Z * [new branch] release/2.4 -> origin/release/2.4 2025-09-07T07:38:46.5478593Z * [new branch] release/2.5 -> origin/release/2.5 2025-09-07T07:38:46.5479309Z * [new branch] release/2.6 -> origin/release/2.6 2025-09-07T07:38:46.5479907Z * [new branch] release/2.7 -> origin/release/2.7 2025-09-07T07:38:46.5480508Z * [new branch] release/2.8 -> origin/release/2.8 2025-09-07T07:38:46.5481120Z * [new branch] release_notes -> origin/release_notes 2025-09-07T07:38:46.5481721Z * [new branch] remove-actionable-label -> origin/remove-actionable-label 2025-09-07T07:38:46.5482544Z * [new branch] remove-ao -> origin/remove-ao 2025-09-07T07:38:46.5483248Z * [new branch] removedeprecatedvllmtest -> origin/removedeprecatedvllmtest 2025-09-07T07:38:46.5483766Z * [new branch] replace-pytorch-labs-20250812-195836 -> origin/replace-pytorch-labs-20250812-195836 2025-09-07T07:38:46.5484296Z * [new branch] replace-pytorch-labs-20250812-200248 -> origin/replace-pytorch-labs-20250812-200248 2025-09-07T07:38:46.5484711Z * [new branch] replace-pytorch-labs-20250812-200324 -> origin/replace-pytorch-labs-20250812-200324 2025-09-07T07:38:46.5485418Z * [new branch] replace-pytorch-labs-20250812-204020 -> origin/replace-pytorch-labs-20250812-204020 2025-09-07T07:38:46.5485852Z * [new branch] replace-pytorch-labs-20250812-204125 -> origin/replace-pytorch-labs-20250812-204125 2025-09-07T07:38:46.5486396Z * [new branch] replace-pytorch-labs-20250812-205624 -> origin/replace-pytorch-labs-20250812-205624 2025-09-07T07:38:46.5487745Z * [new branch] revert-131069-gh/krzysztofjordan/1/head -> origin/revert-131069-gh/krzysztofjordan/1/head 2025-09-07T07:38:46.5488852Z * [new branch] revert-131469-gh/andrewor14/51/head -> origin/revert-131469-gh/andrewor14/51/head 2025-09-07T07:38:46.5490239Z * [new branch] revert-156870-gh/skarjala/3/head -> 
origin/revert-156870-gh/skarjala/3/head 2025-09-07T07:38:46.5491658Z * [new branch] revert-157914-cherry-pick-157503-by-pytorch_bot_bot_ -> origin/revert-157914-cherry-pick-157503-by-pytorch_bot_bot_ 2025-09-07T07:38:46.5491980Z * [new branch] rocm-monitoring -> origin/rocm-monitoring 2025-09-07T07:38:46.5492106Z * [new branch] ruisi/relax_memory -> origin/ruisi/relax_memory 2025-09-07T07:38:46.5492557Z * [new branch] run-torchbench-smoke-test-h100 -> origin/run-torchbench-smoke-test-h100 2025-09-07T07:38:46.5493843Z * [new branch] ryanguo99/cleanup-dynamo-expected-failures -> origin/ryanguo99/cleanup-dynamo-expected-failures 2025-09-07T07:38:46.5494053Z * [new branch] ryanguo99/fix-closure-var -> origin/ryanguo99/fix-closure-var 2025-09-07T07:38:46.5494701Z * [new branch] rzou/faketensor_bench -> origin/rzou/faketensor_bench 2025-09-07T07:38:46.5495046Z * [new branch] rzou/njt -> origin/rzou/njt 2025-09-07T07:38:46.5495672Z * [new branch] rzou/pca -> origin/rzou/pca 2025-09-07T07:38:46.5496212Z * [new branch] rzou/realprop -> origin/rzou/realprop 2025-09-07T07:38:46.5496642Z * [new branch] rzou/setup_context -> origin/rzou/setup_context 2025-09-07T07:38:46.5497695Z * [new branch] sanchitintel/refactor_aten_int8_woq_gemm -> origin/sanchitintel/refactor_aten_int8_woq_gemm 2025-09-07T07:38:46.5498269Z * [new branch] sanchitintel/weird_thing_with_test_cpu_select_algorithm -> origin/sanchitintel/weird_thing_with_test_cpu_select_algorithm 2025-09-07T07:38:46.5498833Z * [new branch] sapling-pr-archive-SS-JIA -> origin/sapling-pr-archive-SS-JIA 2025-09-07T07:38:46.5499530Z * [new branch] save -> origin/save 2025-09-07T07:38:46.5500309Z * [new branch] sdym/2.5.1 -> origin/sdym/2.5.1 2025-09-07T07:38:46.5500953Z * [new branch] seemethere-patch-1 -> origin/seemethere-patch-1 2025-09-07T07:38:46.5501421Z * [new branch] setupvllm -> origin/setupvllm 2025-09-07T07:38:46.5502041Z * [new branch] share_and_pin_fork -> origin/share_and_pin_fork 2025-09-07T07:38:46.5502798Z * [new branch] shengf/fx-xform-perf -> origin/shengf/fx-xform-perf 2025-09-07T07:38:46.5503418Z * [new branch] shikaili_fp8_allgather -> origin/shikaili_fp8_allgather 2025-09-07T07:38:46.5503999Z * [new branch] shoumikhin-patch-1 -> origin/shoumikhin-patch-1 2025-09-07T07:38:46.5504608Z * [new branch] shoumikhin-patch-12 -> origin/shoumikhin-patch-12 2025-09-07T07:38:46.5505143Z * [new branch] simplify-fq-per-channel -> origin/simplify-fq-per-channel 2025-09-07T07:38:46.5505740Z * [new branch] solve-accuracy-fix -> origin/solve-accuracy-fix 2025-09-07T07:38:46.5506486Z * [new branch] soulitzer/stash-tls-ac -> origin/soulitzer/stash-tls-ac 2025-09-07T07:38:46.5507335Z * [new branch] sqzhang/flight4 -> origin/sqzhang/flight4 2025-09-07T07:38:46.5507798Z * [new branch] sqzhang/flight4plus -> origin/sqzhang/flight4plus 2025-09-07T07:38:46.5508595Z * [new branch] sraikund/record_funct_test -> origin/sraikund/record_funct_test 2025-09-07T07:38:46.5509781Z * [new branch] sraikund16/test -> origin/sraikund16/test 2025-09-07T07:38:46.5510388Z * [new branch] stablize-compilation-time -> origin/stablize-compilation-time 2025-09-07T07:38:46.5510850Z * [new branch] standalone-templates -> origin/standalone-templates 2025-09-07T07:38:46.5511583Z * [new branch] standalone_package_weights -> origin/standalone_package_weights 2025-09-07T07:38:46.5512000Z * [new branch] starterTaskUpdate -> origin/starterTaskUpdate 2025-09-07T07:38:46.5512616Z * [new branch] subgraph_fuse -> origin/subgraph_fuse 2025-09-07T07:38:46.5513587Z * [new branch] 
support-uv-in-collect_env -> origin/support-uv-in-collect_env 2025-09-07T07:38:46.5514037Z * [new branch] sve-poc -> origin/sve-poc 2025-09-07T07:38:46.5514742Z * [new branch] svekars-patch-1 -> origin/svekars-patch-1 2025-09-07T07:38:46.5515389Z * [new branch] switch-bn -> origin/switch-bn 2025-09-07T07:38:46.5515995Z * [new branch] sympy-bottleneck-repro -> origin/sympy-bottleneck-repro 2025-09-07T07:38:46.5516655Z * [new branch] tenpercent/ck_rocm_ci_v3 -> origin/tenpercent/ck_rocm_ci_v3 2025-09-07T07:38:46.5517237Z * [new branch] tensordict_integration -> origin/tensordict_integration 2025-09-07T07:38:46.5517682Z * [new branch] test-7054 -> origin/test-7054 2025-09-07T07:38:46.5518518Z * [new branch] test-move-conda-builds -> origin/test-move-conda-builds 2025-09-07T07:38:46.5519144Z * [new branch] test-myst-markdown-docstring -> origin/test-myst-markdown-docstring 2025-09-07T07:38:46.5519550Z * [new branch] test-old -> origin/test-old 2025-09-07T07:38:46.5520263Z * [new branch] test-vec-migration-internally -> origin/test-vec-migration-internally 2025-09-07T07:38:46.5520901Z * [new branch] test/bmm_heur -> origin/test/bmm_heur 2025-09-07T07:38:46.5521323Z * [new branch] test/inductor -> origin/test/inductor 2025-09-07T07:38:46.5522132Z * [new branch] tianren/flex_paged_attn_fix -> origin/tianren/flex_paged_attn_fix 2025-09-07T07:38:46.5522634Z * [new branch] tianren/flex_paged_attn_fix_temp -> origin/tianren/flex_paged_attn_fix_temp 2025-09-07T07:38:46.5523036Z * [new branch] tianren/test -> origin/tianren/test 2025-09-07T07:38:46.5523675Z * [new branch] tidy_performance_cyy -> origin/tidy_performance_cyy 2025-09-07T07:38:46.5524138Z * [new branch] torchtitan_ep -> origin/torchtitan_ep 2025-09-07T07:38:46.5524875Z * [new branch] trace_fsdp_torchtune_lora -> origin/trace_fsdp_torchtune_lora 2025-09-07T07:38:46.5525328Z * [new branch] traceable_fsdp_unit_tests -> origin/traceable_fsdp_unit_tests 2025-09-07T07:38:46.5525955Z * [new branch] tree_loop_vec_base -> origin/tree_loop_vec_base 2025-09-07T07:38:46.5526535Z * [new branch] tree_vec_base -> origin/tree_vec_base 2025-09-07T07:38:46.5527234Z * [new branch] triton-update -> origin/triton-update 2025-09-07T07:38:46.5527697Z * [new branch] triton_kernel -> origin/triton_kernel 2025-09-07T07:38:46.5528282Z * [new branch] triton_kernel_perf -> origin/triton_kernel_perf 2025-09-07T07:38:46.5528748Z * [new branch] tt_pkg_1908 -> origin/tt_pkg_1908 2025-09-07T07:38:46.5529405Z * [new branch] tweak-transformer-dependabot -> origin/tweak-transformer-dependabot 2025-09-07T07:38:46.5529844Z * [new branch] type_dec -> origin/type_dec 2025-09-07T07:38:46.5530549Z * [new branch] udate-sphinx-dependancies -> origin/udate-sphinx-dependancies 2025-09-07T07:38:46.5531433Z * [new branch] update-audio-commit-hash/16818882925-1712-1 -> origin/update-audio-commit-hash/16818882925-1712-1 2025-09-07T07:38:46.5531813Z * [new branch] update-audio-commit-hash/16895560422-1720-1 -> origin/update-audio-commit-hash/16895560422-1720-1 2025-09-07T07:38:46.5532337Z * [new branch] update-audio-commit-hash/16924174496-1738-1 -> origin/update-audio-commit-hash/16924174496-1738-1 2025-09-07T07:38:46.5532809Z * [new branch] update-audio-commit-hash/17002010821-1749-1 -> origin/update-audio-commit-hash/17002010821-1749-1 2025-09-07T07:38:46.5533293Z * [new branch] update-audio-commit-hash/17056004427-1766-1 -> origin/update-audio-commit-hash/17056004427-1766-1 2025-09-07T07:38:46.5534035Z * [new branch] update-audio-commit-hash/17085054029-1767-1 -> 
origin/update-audio-commit-hash/17085054029-1767-1 2025-09-07T07:38:46.5534581Z * [new branch] update-audio-commit-hash/17142507405-1771-1 -> origin/update-audio-commit-hash/17142507405-1771-1 2025-09-07T07:38:46.5535348Z * [new branch] update-audio-commit-hash/17168762740-1773-1 -> origin/update-audio-commit-hash/17168762740-1773-1 2025-09-07T07:38:46.5535902Z * [new branch] update-audio-commit-hash/17311174639-1780-1 -> origin/update-audio-commit-hash/17311174639-1780-1 2025-09-07T07:38:46.5536423Z * [new branch] update-audio-commit-hash/17336898740-1781-1 -> origin/update-audio-commit-hash/17336898740-1781-1 2025-09-07T07:38:46.5536916Z * [new branch] update-audio-commit-hash/17389727684-1786-1 -> origin/update-audio-commit-hash/17389727684-1786-1 2025-09-07T07:38:46.5537406Z * [new branch] update-audio-commit-hash/17449538142-1790-1 -> origin/update-audio-commit-hash/17449538142-1790-1 2025-09-07T07:38:46.5537943Z * [new branch] update-audio-commit-hash/17507351808-1794-1 -> origin/update-audio-commit-hash/17507351808-1794-1 2025-09-07T07:38:46.5538579Z * [new branch] update-dynamic-shapes-doc -> origin/update-dynamic-shapes-doc 2025-09-07T07:38:46.5539364Z * [new branch] update-executorch-commit-hash/15694981040-1626-1 -> origin/update-executorch-commit-hash/15694981040-1626-1 2025-09-07T07:38:46.5540044Z * [new branch] update-triton-commit-hash/13663274526-1487-2 -> origin/update-triton-commit-hash/13663274526-1487-2 2025-09-07T07:38:46.5540796Z * [new branch] update-vision-commit-hash/15336342773-1607-1 -> origin/update-vision-commit-hash/15336342773-1607-1 2025-09-07T07:38:46.5541496Z * [new branch] update-vllm-commit-hash/16737365217-1704-1 -> origin/update-vllm-commit-hash/16737365217-1704-1 2025-09-07T07:38:46.5541996Z * [new branch] update-vllm-commit-hash/16843157111-1713-1 -> origin/update-vllm-commit-hash/16843157111-1713-1 2025-09-07T07:38:46.5542461Z * [new branch] update-vllm-commit-hash/16855312394-1714-1 -> origin/update-vllm-commit-hash/16855312394-1714-1 2025-09-07T07:38:46.5542968Z * [new branch] update-vllm-commit-hash/16924174496-1738-1 -> origin/update-vllm-commit-hash/16924174496-1738-1 2025-09-07T07:38:46.5543412Z * [new branch] update-vllm-commit-hash/16952608705-1745-1 -> origin/update-vllm-commit-hash/16952608705-1745-1 2025-09-07T07:38:46.5544134Z * [new branch] update-vllm-commit-hash/16979836546-1748-1 -> origin/update-vllm-commit-hash/16979836546-1748-1 2025-09-07T07:38:46.5544863Z * [new branch] update-vllm-commit-hash/17014576881-1756-1 -> origin/update-vllm-commit-hash/17014576881-1756-1 2025-09-07T07:38:46.5545536Z * [new branch] update-vllm-commit-hash/17027830869-1761-1 -> origin/update-vllm-commit-hash/17027830869-1761-1 2025-09-07T07:38:46.5545965Z * [new branch] update-vllm-commit-hash/17056004427-1766-1 -> origin/update-vllm-commit-hash/17056004427-1766-1 2025-09-07T07:38:46.5546461Z * [new branch] update-vllm-commit-hash/17085054029-1767-1 -> origin/update-vllm-commit-hash/17085054029-1767-1 2025-09-07T07:38:46.5546946Z * [new branch] update-vllm-commit-hash/17113610216-1768-1 -> origin/update-vllm-commit-hash/17113610216-1768-1 2025-09-07T07:38:46.5547664Z * [new branch] update-vllm-commit-hash/17142507405-1771-1 -> origin/update-vllm-commit-hash/17142507405-1771-1 2025-09-07T07:38:46.5548036Z * [new branch] update-vllm-commit-hash/17181878974-1774-1 -> origin/update-vllm-commit-hash/17181878974-1774-1 2025-09-07T07:38:46.5548544Z * [new branch] update-vllm-commit-hash/17311174639-1780-1 -> origin/update-vllm-commit-hash/17311174639-1780-1 
2025-09-07T07:38:46.5549045Z * [new branch] update-vllm-commit-hash/17336898740-1781-1 -> origin/update-vllm-commit-hash/17336898740-1781-1 2025-09-07T07:38:46.5549528Z * [new branch] update-vllm-commit-hash/17364352302-1785-1 -> origin/update-vllm-commit-hash/17364352302-1785-1 2025-09-07T07:38:46.5550037Z * [new branch] update-vllm-commit-hash/17389727684-1786-1 -> origin/update-vllm-commit-hash/17389727684-1786-1 2025-09-07T07:38:46.5550533Z * [new branch] update-vllm-commit-hash/17449538142-1790-1 -> origin/update-vllm-commit-hash/17449538142-1790-1 2025-09-07T07:38:46.5551045Z * [new branch] update-vllm-commit-hash/17480069797-1791-1 -> origin/update-vllm-commit-hash/17480069797-1791-1 2025-09-07T07:38:46.5551514Z * [new branch] update-vllm-commit-hash/17507351808-1794-1 -> origin/update-vllm-commit-hash/17507351808-1794-1 2025-09-07T07:38:46.5552442Z * [new branch] update-xla-commit-hash/16873912760-198-1 -> origin/update-xla-commit-hash/16873912760-198-1 2025-09-07T07:38:46.5552832Z * [new branch] update-xla-commit-hash/17034266655-199-1 -> origin/update-xla-commit-hash/17034266655-199-1 2025-09-07T07:38:46.5553343Z * [new branch] update-xla-commit-hash/17202464405-200-1 -> origin/update-xla-commit-hash/17202464405-200-1 2025-09-07T07:38:46.5554066Z * [new branch] update_docs_torch_multinomial_issue#125388 -> origin/update_docs_torch_multinomial_issue#125388 2025-09-07T07:38:46.5554448Z * [new branch] update_executorch_pin -> origin/update_executorch_pin 2025-09-07T07:38:46.5555086Z * [new branch] update_slow_tests_1722488736 -> origin/update_slow_tests_1722488736 2025-09-07T07:38:46.5555728Z * [new branch] update_slow_tests_1722879173 -> origin/update_slow_tests_1722879173 2025-09-07T07:38:46.5556186Z * [new branch] update_slow_tests_1752478971 -> origin/update_slow_tests_1752478971 2025-09-07T07:38:46.5556823Z * [new branch] update_slow_tests_1755502951 -> origin/update_slow_tests_1755502951 2025-09-07T07:38:46.5557305Z * [new branch] update_slow_tests_1756107664 -> origin/update_slow_tests_1756107664 2025-09-07T07:38:46.5557984Z * [new branch] update_submodule_FBGEMM -> origin/update_submodule_FBGEMM 2025-09-07T07:38:46.5558468Z * [new branch] update_submodule_kineto -> origin/update_submodule_kineto 2025-09-07T07:38:46.5559103Z * [new branch] update_submodule_tensorpipe -> origin/update_submodule_tensorpipe 2025-09-07T07:38:46.5559736Z * [new branch] v0.1.2 -> origin/v0.1.2 2025-09-07T07:38:46.5560652Z * [new branch] v1.0.1 -> origin/v1.0.1 2025-09-07T07:38:46.5561352Z * [new branch] v1.0.3 -> origin/v1.0.3 2025-09-07T07:38:46.5561911Z * [new branch] v1.1.0 -> origin/v1.1.0 2025-09-07T07:38:46.5562541Z * [new branch] v1.2.0 -> origin/v1.2.0 2025-09-07T07:38:46.5563125Z * [new branch] v1.3.0 -> origin/v1.3.0 2025-09-07T07:38:46.5563717Z * [new branch] v1.3.1 -> origin/v1.3.1 2025-09-07T07:38:46.5564330Z * [new branch] validate_fn -> origin/validate_fn 2025-09-07T07:38:46.5564991Z * [new branch] validations_2.6 -> origin/validations_2.6 2025-09-07T07:38:46.5565666Z * [new branch] validations_2.8 -> origin/validations_2.8 2025-09-07T07:38:46.5566502Z * [new branch] viable/strict -> origin/viable/strict 2025-09-07T07:38:46.5566949Z * [new branch] vllmbuildci -> origin/vllmbuildci 2025-09-07T07:38:46.5567610Z * [new branch] vllmpin -> origin/vllmpin 2025-09-07T07:38:46.5568379Z * [new branch] wdvr/conda_devcontainer -> origin/wdvr/conda_devcontainer 2025-09-07T07:38:46.5568772Z * [new branch] wdvr/iss_145259 -> origin/wdvr/iss_145259 2025-09-07T07:38:46.5569524Z * [new branch] 
weight_sharing_cpp -> origin/weight_sharing_cpp
[... several hundred further fetch entries omitted: '* [new branch]' refs (whc/*, win_warnings, windows_libtorch_free, workonoldcommit, wychi-*, xmfan/*, yguo/*, yihan_quantization, yiming/*, zainr/*, zasdfgbnm-patch-3, zb2p, zero_grad_optimization, zeros-and-scatter-part2, zhxchen17/*, zhxhcen17/*, zxiiro/main) and '* [new tag]' refs (ci/binaries/*, ciflow/* workflow-trigger tags, cslpull75 through cslpull92, flight_5*, forpull1, malfet/tag-*, nightly-binary, sqzhang_*, trunk/<commit-sha>) ...]
2025-09-07T07:38:46.5889922Z * [new tag]                 trunk/abc447174cd2cf8591edbc70a9f836f9a5779f47 ->
trunk/abc447174cd2cf8591edbc70a9f836f9a5779f47 2025-09-07T07:38:46.5890400Z * [new tag] trunk/acece97c3a9dceb63194e314da93fdf37cf15a0d -> trunk/acece97c3a9dceb63194e314da93fdf37cf15a0d 2025-09-07T07:38:46.5890892Z * [new tag] trunk/adae7f66aacf3f248c3101b858cf98d5809119fa -> trunk/adae7f66aacf3f248c3101b858cf98d5809119fa 2025-09-07T07:38:46.5891379Z * [new tag] trunk/ae0edc133e61e3b16caf0b2ee0ff3f33ab72af4c -> trunk/ae0edc133e61e3b16caf0b2ee0ff3f33ab72af4c 2025-09-07T07:38:46.5891856Z * [new tag] trunk/aed33a8fcbd60b052d4559d261390c5797129c6d -> trunk/aed33a8fcbd60b052d4559d261390c5797129c6d 2025-09-07T07:38:46.5892315Z * [new tag] trunk/b04e922712080a3652e438d05e8bb74e0cd2d238 -> trunk/b04e922712080a3652e438d05e8bb74e0cd2d238 2025-09-07T07:38:46.5892780Z * [new tag] trunk/b0a3e58dd71c1a039ac0ef51e5bd8f704f632f6f -> trunk/b0a3e58dd71c1a039ac0ef51e5bd8f704f632f6f 2025-09-07T07:38:46.5893307Z * [new tag] trunk/b16d3f4c8c01d461c2f01064e9ca5fa2b33f5cf1 -> trunk/b16d3f4c8c01d461c2f01064e9ca5fa2b33f5cf1 2025-09-07T07:38:46.5893776Z * [new tag] trunk/b18bb6796f210a183e687d9d64984a5a9d13cf09 -> trunk/b18bb6796f210a183e687d9d64984a5a9d13cf09 2025-09-07T07:38:46.5894251Z * [new tag] trunk/b1bb98ddebdd3e41bf7987372409bdce96ae55de -> trunk/b1bb98ddebdd3e41bf7987372409bdce96ae55de 2025-09-07T07:38:46.5894721Z * [new tag] trunk/b2b4add0e754411372060e1d7b4057a66439172b -> trunk/b2b4add0e754411372060e1d7b4057a66439172b 2025-09-07T07:38:46.5895185Z * [new tag] trunk/b2c7b9ad2dc5a7c0b61febd307761bd5bc2f0f05 -> trunk/b2c7b9ad2dc5a7c0b61febd307761bd5bc2f0f05 2025-09-07T07:38:46.5895665Z * [new tag] trunk/b40d9432be44a6b5974ee62e7d19c3c61c5ece37 -> trunk/b40d9432be44a6b5974ee62e7d19c3c61c5ece37 2025-09-07T07:38:46.5896131Z * [new tag] trunk/b4ad38279b178b7bd14355123c1101e2e853e77b -> trunk/b4ad38279b178b7bd14355123c1101e2e853e77b 2025-09-07T07:38:46.5896604Z * [new tag] trunk/b67c41039835bd9b20b83cd6233e86baaa5f5dde -> trunk/b67c41039835bd9b20b83cd6233e86baaa5f5dde 2025-09-07T07:38:46.5897091Z * [new tag] trunk/b6d0a9ea9056ede4f7024dbf3bd6c43be3aff49c -> trunk/b6d0a9ea9056ede4f7024dbf3bd6c43be3aff49c 2025-09-07T07:38:46.5897574Z * [new tag] trunk/b7dad7dd49448c88d0751fa2e29c70afe985f734 -> trunk/b7dad7dd49448c88d0751fa2e29c70afe985f734 2025-09-07T07:38:46.5898046Z * [new tag] trunk/b7e207ca9f046ddd716076965a0cce403ba99052 -> trunk/b7e207ca9f046ddd716076965a0cce403ba99052 2025-09-07T07:38:46.5898513Z * [new tag] trunk/b919560c4a7010e2d89facee25586269a994746e -> trunk/b919560c4a7010e2d89facee25586269a994746e 2025-09-07T07:38:46.5898977Z * [new tag] trunk/b9ba612f7a968f7b27e121ca8f4d0a4d954f5354 -> trunk/b9ba612f7a968f7b27e121ca8f4d0a4d954f5354 2025-09-07T07:38:46.5899447Z * [new tag] trunk/ba7f546ccccb5e0b36d9070dc25f26a9647f89f8 -> trunk/ba7f546ccccb5e0b36d9070dc25f26a9647f89f8 2025-09-07T07:38:46.5899945Z * [new tag] trunk/bb950284c7e72905994bc25dd436c10e48088d85 -> trunk/bb950284c7e72905994bc25dd436c10e48088d85 2025-09-07T07:38:46.5900432Z * [new tag] trunk/bbedc71fd3267c639c38b4ec25eaa22f973d9c4d -> trunk/bbedc71fd3267c639c38b4ec25eaa22f973d9c4d 2025-09-07T07:38:46.5900901Z * [new tag] trunk/bc4db2c27fce6ff1648bdc5af31ec225d2a31f37 -> trunk/bc4db2c27fce6ff1648bdc5af31ec225d2a31f37 2025-09-07T07:38:46.5901370Z * [new tag] trunk/bc505977fb66677a09c31155c987330fbb18a865 -> trunk/bc505977fb66677a09c31155c987330fbb18a865 2025-09-07T07:38:46.5901842Z * [new tag] trunk/bd39e47feea7326afb5bbb67fcb1e69279239527 -> trunk/bd39e47feea7326afb5bbb67fcb1e69279239527 2025-09-07T07:38:46.5902317Z * [new tag] 
trunk/be5b03dde96638f25ffd732a4fed7e41b4cf40e1 -> trunk/be5b03dde96638f25ffd732a4fed7e41b4cf40e1 2025-09-07T07:38:46.5902791Z * [new tag] trunk/bffc7dd1f374d8408911cd22c6b3d6df39ded9b3 -> trunk/bffc7dd1f374d8408911cd22c6b3d6df39ded9b3 2025-09-07T07:38:46.5903278Z * [new tag] trunk/c024b1f5a18d5c5aee5cc2acdd4c52b24b93ffcf -> trunk/c024b1f5a18d5c5aee5cc2acdd4c52b24b93ffcf 2025-09-07T07:38:46.5903738Z * [new tag] trunk/c0983e6cc0acf71689e1851d12609e00b3f59371 -> trunk/c0983e6cc0acf71689e1851d12609e00b3f59371 2025-09-07T07:38:46.5904215Z * [new tag] trunk/c10195e723eeeedd099ed8b73eda7184ca618fad -> trunk/c10195e723eeeedd099ed8b73eda7184ca618fad 2025-09-07T07:38:46.5904692Z * [new tag] trunk/c157cf6488ade6a7ee2ce2d25b059e1335630a99 -> trunk/c157cf6488ade6a7ee2ce2d25b059e1335630a99 2025-09-07T07:38:46.5905152Z * [new tag] trunk/c2a30246172fd71d56529907ffd3c27b76b1f3a7 -> trunk/c2a30246172fd71d56529907ffd3c27b76b1f3a7 2025-09-07T07:38:46.5905603Z * [new tag] trunk/c32111149921b48bfef909293f1049e21619ed76 -> trunk/c32111149921b48bfef909293f1049e21619ed76 2025-09-07T07:38:46.5906293Z * [new tag] trunk/c37103234afc832dcad307e9016230810957c9d5 -> trunk/c37103234afc832dcad307e9016230810957c9d5 2025-09-07T07:38:46.5906745Z * [new tag] trunk/c3ceca2995cd35e1376c4b0704669bff1a81e836 -> trunk/c3ceca2995cd35e1376c4b0704669bff1a81e836 2025-09-07T07:38:46.5907219Z * [new tag] trunk/c3d54dea9febb1236d48d19e5d4876a63f2e20fd -> trunk/c3d54dea9febb1236d48d19e5d4876a63f2e20fd 2025-09-07T07:38:46.5907683Z * [new tag] trunk/c465b3d52c5687fe910d35a5c75341b77f821741 -> trunk/c465b3d52c5687fe910d35a5c75341b77f821741 2025-09-07T07:38:46.5908143Z * [new tag] trunk/c5b8a10be5e89396da916d1069ffcb7135f0372b -> trunk/c5b8a10be5e89396da916d1069ffcb7135f0372b 2025-09-07T07:38:46.5908604Z * [new tag] trunk/c7e41071a08f4045bc11ab60ec366d7357d56e30 -> trunk/c7e41071a08f4045bc11ab60ec366d7357d56e30 2025-09-07T07:38:46.5909084Z * [new tag] trunk/c98ddaca6d2e19ca37aff00c4ff0cda1e9a6ff65 -> trunk/c98ddaca6d2e19ca37aff00c4ff0cda1e9a6ff65 2025-09-07T07:38:46.5909553Z * [new tag] trunk/cb1e31362c7b53acf4ac95b9f8878064c184f03b -> trunk/cb1e31362c7b53acf4ac95b9f8878064c184f03b 2025-09-07T07:38:46.5910014Z * [new tag] trunk/cbfb005f7cce79974795b148e265f594f59477c8 -> trunk/cbfb005f7cce79974795b148e265f594f59477c8 2025-09-07T07:38:46.5910479Z * [new tag] trunk/cc5bdd12401bda835291d2f3cb297132ebdbf358 -> trunk/cc5bdd12401bda835291d2f3cb297132ebdbf358 2025-09-07T07:38:46.5910940Z * [new tag] trunk/cd529b686d54bbaa443f5b310140de48422d96c7 -> trunk/cd529b686d54bbaa443f5b310140de48422d96c7 2025-09-07T07:38:46.5911397Z * [new tag] trunk/cec0ff122815582af5302360aff03676558c5c87 -> trunk/cec0ff122815582af5302360aff03676558c5c87 2025-09-07T07:38:46.5911854Z * [new tag] trunk/d11720efdb563d02cf4f7d324311fb15a755268e -> trunk/d11720efdb563d02cf4f7d324311fb15a755268e 2025-09-07T07:38:46.5912318Z * [new tag] trunk/d1706d9128ae24d9048167e80d3fe5196d19035e -> trunk/d1706d9128ae24d9048167e80d3fe5196d19035e 2025-09-07T07:38:46.5912816Z * [new tag] trunk/d1a15abfdcaef138f2d9e93a9f46be44f30b766d -> trunk/d1a15abfdcaef138f2d9e93a9f46be44f30b766d 2025-09-07T07:38:46.5913294Z * [new tag] trunk/d232a95d4a79404ca05c1f52d37fde7339dcdf49 -> trunk/d232a95d4a79404ca05c1f52d37fde7339dcdf49 2025-09-07T07:38:46.5913758Z * [new tag] trunk/d2d4c8e9b2371c9aacfb771d9402ac7427b9778e -> trunk/d2d4c8e9b2371c9aacfb771d9402ac7427b9778e 2025-09-07T07:38:46.5914226Z * [new tag] trunk/d33840c542b387ab08ba49aa6c45aa9567fd9be7 -> trunk/d33840c542b387ab08ba49aa6c45aa9567fd9be7 
2025-09-07T07:38:46.5914692Z * [new tag] trunk/d5643e8f3a648a99636bfa1f2a41d54bd3c0d0f1 -> trunk/d5643e8f3a648a99636bfa1f2a41d54bd3c0d0f1 2025-09-07T07:38:46.5915161Z * [new tag] trunk/d5b38410b5b6cf75c7a7389972777a6497926ee7 -> trunk/d5b38410b5b6cf75c7a7389972777a6497926ee7 2025-09-07T07:38:46.5915621Z * [new tag] trunk/d5e0f4202ba14632e4d14862ace096609e763462 -> trunk/d5e0f4202ba14632e4d14862ace096609e763462 2025-09-07T07:38:46.5916078Z * [new tag] trunk/d636c181f9140a7b59be10b36eae23039fc2bb72 -> trunk/d636c181f9140a7b59be10b36eae23039fc2bb72 2025-09-07T07:38:46.5916533Z * [new tag] trunk/d64718503728001a1e78168fd7f2d4ff23e57285 -> trunk/d64718503728001a1e78168fd7f2d4ff23e57285 2025-09-07T07:38:46.5916988Z * [new tag] trunk/d67c29ad22670320d676b02e394274af34e8e643 -> trunk/d67c29ad22670320d676b02e394274af34e8e643 2025-09-07T07:38:46.5917438Z * [new tag] trunk/d6b74568e2c98ce58ecc145b72ac66d4caf7ce95 -> trunk/d6b74568e2c98ce58ecc145b72ac66d4caf7ce95 2025-09-07T07:38:46.5917902Z * [new tag] trunk/d711f27845abd45007ccab6076649ebd896c2661 -> trunk/d711f27845abd45007ccab6076649ebd896c2661 2025-09-07T07:38:46.5918394Z * [new tag] trunk/d9d6dde0f42d4bcc8c97671ac50d5096c7e500ab -> trunk/d9d6dde0f42d4bcc8c97671ac50d5096c7e500ab 2025-09-07T07:38:46.5918869Z * [new tag] trunk/da4db4b33d1fdd046650cf19fdbac581a19bf2f9 -> trunk/da4db4b33d1fdd046650cf19fdbac581a19bf2f9 2025-09-07T07:38:46.5919353Z * [new tag] trunk/dac8a4b91c01c3bbc96f54e621b1ea4ffdbd29d1 -> trunk/dac8a4b91c01c3bbc96f54e621b1ea4ffdbd29d1 2025-09-07T07:38:46.5919825Z * [new tag] trunk/dbec08729fb9848bebed6048c63831b87170d061 -> trunk/dbec08729fb9848bebed6048c63831b87170d061 2025-09-07T07:38:46.5920277Z * [new tag] trunk/dcf385395d838f38c8dca25913578230dd43099a -> trunk/dcf385395d838f38c8dca25913578230dd43099a 2025-09-07T07:38:46.5920740Z * [new tag] trunk/dd2519abe83ec3c40d4797492434e41fe3b47e17 -> trunk/dd2519abe83ec3c40d4797492434e41fe3b47e17 2025-09-07T07:38:46.5921215Z * [new tag] trunk/dec72ea4b006dd0fbcaaaa106ad273d73807ab9d -> trunk/dec72ea4b006dd0fbcaaaa106ad273d73807ab9d 2025-09-07T07:38:46.5921687Z * [new tag] trunk/e0a62b266c021b910ce6dc02a6c9429210487717 -> trunk/e0a62b266c021b910ce6dc02a6c9429210487717 2025-09-07T07:38:46.5922157Z * [new tag] trunk/e19e02c84c9dcc408375e5cae3b0709c18b99228 -> trunk/e19e02c84c9dcc408375e5cae3b0709c18b99228 2025-09-07T07:38:46.5922621Z * [new tag] trunk/e304ea4e69d3a7deeb7e48c7450c214a4c953937 -> trunk/e304ea4e69d3a7deeb7e48c7450c214a4c953937 2025-09-07T07:38:46.5923078Z * [new tag] trunk/e3068cdb446adefb5a875616ba37a60235391439 -> trunk/e3068cdb446adefb5a875616ba37a60235391439 2025-09-07T07:38:46.5923538Z * [new tag] trunk/e381d4b0205d5f126c1de534f867ba776f7c3ee6 -> trunk/e381d4b0205d5f126c1de534f867ba776f7c3ee6 2025-09-07T07:38:46.5924009Z * [new tag] trunk/e4bd0ff4f8981b805df32ea5b3550621965ea4f2 -> trunk/e4bd0ff4f8981b805df32ea5b3550621965ea4f2 2025-09-07T07:38:46.5924479Z * [new tag] trunk/e532c9d4f1cdcbc1ea9628f55b9813e77847bdc7 -> trunk/e532c9d4f1cdcbc1ea9628f55b9813e77847bdc7 2025-09-07T07:38:46.5924989Z * [new tag] trunk/e92cd9415377403b6e90585e764639e2e0b5973b -> trunk/e92cd9415377403b6e90585e764639e2e0b5973b 2025-09-07T07:38:46.5925437Z * [new tag] trunk/e9481b6617b5576b099d8ca5798111592e9ad090 -> trunk/e9481b6617b5576b099d8ca5798111592e9ad090 2025-09-07T07:38:46.5925887Z * [new tag] trunk/ea1883dfd3e42defe37b11202b878bb76defa087 -> trunk/ea1883dfd3e42defe37b11202b878bb76defa087 2025-09-07T07:38:46.5926367Z * [new tag] trunk/eac3d6f04cfbbebe3d470dacd216da7d4b1f95a8 -> 
trunk/eac3d6f04cfbbebe3d470dacd216da7d4b1f95a8 2025-09-07T07:38:46.5926844Z * [new tag] trunk/eb18d32bda75189494d955aa001ade15f10333de -> trunk/eb18d32bda75189494d955aa001ade15f10333de 2025-09-07T07:38:46.5927318Z * [new tag] trunk/ef3be6726f7ff4b77c22db10cec5b686f9107ea9 -> trunk/ef3be6726f7ff4b77c22db10cec5b686f9107ea9 2025-09-07T07:38:46.5927797Z * [new tag] trunk/ef8aabd42422725026cb4dbf48aafa9efa226a04 -> trunk/ef8aabd42422725026cb4dbf48aafa9efa226a04 2025-09-07T07:38:46.5928268Z * [new tag] trunk/f00445b43eee57e20bb9316fa796ca23bf73373b -> trunk/f00445b43eee57e20bb9316fa796ca23bf73373b 2025-09-07T07:38:46.5928725Z * [new tag] trunk/f0c391102b754e3b145e8c59231d2df563487e37 -> trunk/f0c391102b754e3b145e8c59231d2df563487e37 2025-09-07T07:38:46.5929177Z * [new tag] trunk/f27985b7e796fb66a1b476284ba42d8cb360a751 -> trunk/f27985b7e796fb66a1b476284ba42d8cb360a751 2025-09-07T07:38:46.5929630Z * [new tag] trunk/f36f285953700f971552083a5da9d0ceacb63bbd -> trunk/f36f285953700f971552083a5da9d0ceacb63bbd 2025-09-07T07:38:46.5930095Z * [new tag] trunk/f3cebec39ebc110e1c8b06e741896585f7892dbb -> trunk/f3cebec39ebc110e1c8b06e741896585f7892dbb 2025-09-07T07:38:46.5930567Z * [new tag] trunk/f4c33cd44acac92c0b451a04da20ebe9370e5b0c -> trunk/f4c33cd44acac92c0b451a04da20ebe9370e5b0c 2025-09-07T07:38:46.5931063Z * [new tag] trunk/f612045ce105f008b2b675e2fc870163babeb2e8 -> trunk/f612045ce105f008b2b675e2fc870163babeb2e8 2025-09-07T07:38:46.5931527Z * [new tag] trunk/f8746b878dfc1e9639d42cbde832e9b9e792c86c -> trunk/f8746b878dfc1e9639d42cbde832e9b9e792c86c 2025-09-07T07:38:46.5931999Z * [new tag] trunk/f8ffa9194e26523e5f976d4a824d5cc58922727c -> trunk/f8ffa9194e26523e5f976d4a824d5cc58922727c 2025-09-07T07:38:46.5932463Z * [new tag] trunk/f981a7fa5230b98974291fdde32fe8488bc5d469 -> trunk/f981a7fa5230b98974291fdde32fe8488bc5d469 2025-09-07T07:38:46.5932942Z * [new tag] trunk/fbf3d2027daabbcb44d0af274b139be2a248a4f7 -> trunk/fbf3d2027daabbcb44d0af274b139be2a248a4f7 2025-09-07T07:38:46.5933426Z * [new tag] trunk/fca2601c9d628e1bd2d75c7318cd22c4e8c832aa -> trunk/fca2601c9d628e1bd2d75c7318cd22c4e8c832aa 2025-09-07T07:38:46.5933887Z * [new tag] trunk/fea20775ad96bdca972a1811d7d3372f368614ab -> trunk/fea20775ad96bdca972a1811d7d3372f368614ab 2025-09-07T07:38:46.5934359Z * [new tag] trunk/fefee081642f87419a21dc852f7167d4640443cd -> trunk/fefee081642f87419a21dc852f7167d4640443cd 2025-09-07T07:38:46.5934716Z * [new tag] v0.1.1 -> v0.1.1 2025-09-07T07:38:46.5934961Z * [new tag] v0.1.10 -> v0.1.10 2025-09-07T07:38:46.5935195Z * [new tag] v0.1.11 -> v0.1.11 2025-09-07T07:38:46.5935416Z * [new tag] v0.1.12 -> v0.1.12 2025-09-07T07:38:46.5935652Z * [new tag] v0.1.2 -> v0.1.2 2025-09-07T07:38:46.5935881Z * [new tag] v0.1.3 -> v0.1.3 2025-09-07T07:38:46.5936100Z * [new tag] v0.1.4 -> v0.1.4 2025-09-07T07:38:46.5936319Z * [new tag] v0.1.5 -> v0.1.5 2025-09-07T07:38:46.5936541Z * [new tag] v0.1.6 -> v0.1.6 2025-09-07T07:38:46.5936794Z * [new tag] v0.1.7 -> v0.1.7 2025-09-07T07:38:46.5937017Z * [new tag] v0.1.8 -> v0.1.8 2025-09-07T07:38:46.5937236Z * [new tag] v0.1.9 -> v0.1.9 2025-09-07T07:38:46.5937488Z * [new tag] v0.2.0 -> v0.2.0 2025-09-07T07:38:46.5937861Z * [new tag] v0.3.0 -> v0.3.0 2025-09-07T07:38:46.5938331Z * [new tag] v0.3.1 -> v0.3.1 2025-09-07T07:38:46.5938743Z * [new tag] v0.4.0 -> v0.4.0 2025-09-07T07:38:46.5939107Z * [new tag] v0.4.1 -> v0.4.1 2025-09-07T07:38:46.5939546Z * [new tag] v1.0.0 -> v1.0.0 2025-09-07T07:38:46.5940029Z * [new tag] v1.0.0a0 -> v1.0.0a0 2025-09-07T07:38:46.5940425Z * [new tag] v1.0.1 
-> v1.0.1 2025-09-07T07:38:46.5940867Z * [new tag] v1.0rc0 -> v1.0rc0 2025-09-07T07:38:46.5941151Z * [new tag] v1.0rc1 -> v1.0rc1 2025-09-07T07:38:46.5941990Z * [new tag] v1.1.0 -> v1.1.0 2025-09-07T07:38:46.5942331Z * [new tag] v1.1.0a0 -> v1.1.0a0 2025-09-07T07:38:46.5942896Z * [new tag] v1.10.0 -> v1.10.0 2025-09-07T07:38:46.5943418Z * [new tag] v1.10.0-rc1 -> v1.10.0-rc1 2025-09-07T07:38:46.5943871Z * [new tag] v1.10.0-rc2 -> v1.10.0-rc2 2025-09-07T07:38:46.5944227Z * [new tag] v1.10.0-rc3 -> v1.10.0-rc3 2025-09-07T07:38:46.5944632Z * [new tag] v1.10.1 -> v1.10.1 2025-09-07T07:38:46.5945064Z * [new tag] v1.10.1-rc1 -> v1.10.1-rc1 2025-09-07T07:38:46.5945404Z * [new tag] v1.10.2 -> v1.10.2 2025-09-07T07:38:46.5945652Z * [new tag] v1.10.2-rc1 -> v1.10.2-rc1 2025-09-07T07:38:46.5946163Z * [new tag] v1.11.0 -> v1.11.0 2025-09-07T07:38:46.5946658Z * [new tag] v1.11.0-rc1 -> v1.11.0-rc1 2025-09-07T07:38:46.5947142Z * [new tag] v1.11.0-rc2 -> v1.11.0-rc2 2025-09-07T07:38:46.5947640Z * [new tag] v1.11.0-rc3 -> v1.11.0-rc3 2025-09-07T07:38:46.5948102Z * [new tag] v1.11.0-rc4 -> v1.11.0-rc4 2025-09-07T07:38:46.5948551Z * [new tag] v1.11.0-rc5 -> v1.11.0-rc5 2025-09-07T07:38:46.5948796Z * [new tag] v1.11.0-rc6 -> v1.11.0-rc6 2025-09-07T07:38:46.5949262Z * [new tag] v1.11.0-rc7 -> v1.11.0-rc7 2025-09-07T07:38:46.5949660Z * [new tag] v1.12.0 -> v1.12.0 2025-09-07T07:38:46.5950054Z * [new tag] v1.12.0-rc1 -> v1.12.0-rc1 2025-09-07T07:38:46.5950543Z * [new tag] v1.12.0-rc2 -> v1.12.0-rc2 2025-09-07T07:38:46.5951013Z * [new tag] v1.12.0-rc3 -> v1.12.0-rc3 2025-09-07T07:38:46.5951438Z * [new tag] v1.12.0-rc4 -> v1.12.0-rc4 2025-09-07T07:38:46.5951931Z * [new tag] v1.12.0-rc5 -> v1.12.0-rc5 2025-09-07T07:38:46.5952414Z * [new tag] v1.12.0-rc6 -> v1.12.0-rc6 2025-09-07T07:38:46.5952730Z * [new tag] v1.12.0-rc7 -> v1.12.0-rc7 2025-09-07T07:38:46.5952992Z * [new tag] v1.12.0-rc8 -> v1.12.0-rc8 2025-09-07T07:38:46.5953319Z * [new tag] v1.12.1 -> v1.12.1 2025-09-07T07:38:46.5953993Z * [new tag] v1.12.1-rc1 -> v1.12.1-rc1 2025-09-07T07:38:46.5954387Z * [new tag] v1.12.1-rc2 -> v1.12.1-rc2 2025-09-07T07:38:46.5954828Z * [new tag] v1.12.1-rc3 -> v1.12.1-rc3 2025-09-07T07:38:46.5955271Z * [new tag] v1.12.1-rc4 -> v1.12.1-rc4 2025-09-07T07:38:46.5955561Z * [new tag] v1.12.1-rc5 -> v1.12.1-rc5 2025-09-07T07:38:46.5956059Z * [new tag] v1.13.0 -> v1.13.0 2025-09-07T07:38:46.5956452Z * [new tag] v1.13.0-rc1 -> v1.13.0-rc1 2025-09-07T07:38:46.5956875Z * [new tag] v1.13.0-rc2 -> v1.13.0-rc2 2025-09-07T07:38:46.5957264Z * [new tag] v1.13.0-rc3 -> v1.13.0-rc3 2025-09-07T07:38:46.5957831Z * [new tag] v1.13.0-rc4 -> v1.13.0-rc4 2025-09-07T07:38:46.5958170Z * [new tag] v1.13.0-rc5 -> v1.13.0-rc5 2025-09-07T07:38:46.5958437Z * [new tag] v1.13.0-rc6 -> v1.13.0-rc6 2025-09-07T07:38:46.5959019Z * [new tag] v1.13.1 -> v1.13.1 2025-09-07T07:38:46.5959312Z * [new tag] v1.13.1-rc1 -> v1.13.1-rc1 2025-09-07T07:38:46.5959749Z * [new tag] v1.2.0 -> v1.2.0 2025-09-07T07:38:46.5960159Z * [new tag] v1.2.0a0 -> v1.2.0a0 2025-09-07T07:38:46.5960565Z * [new tag] v1.3.0 -> v1.3.0 2025-09-07T07:38:46.5960997Z * [new tag] v1.3.0a0 -> v1.3.0a0 2025-09-07T07:38:46.5961334Z * [new tag] v1.3.1 -> v1.3.1 2025-09-07T07:38:46.5961723Z * [new tag] v1.4.0 -> v1.4.0 2025-09-07T07:38:46.5962138Z * [new tag] v1.4.0a0 -> v1.4.0a0 2025-09-07T07:38:46.5962512Z * [new tag] v1.4.1 -> v1.4.1 2025-09-07T07:38:46.5962942Z * [new tag] v1.5.0 -> v1.5.0 2025-09-07T07:38:46.5963440Z * [new tag] v1.5.0-rc1 -> v1.5.0-rc1 2025-09-07T07:38:46.5963869Z * [new tag] 
v1.5.0-rc2 -> v1.5.0-rc2 2025-09-07T07:38:46.5964400Z * [new tag] v1.5.0-rc3 -> v1.5.0-rc3 2025-09-07T07:38:46.5964773Z * [new tag] v1.5.0-rc4 -> v1.5.0-rc4 2025-09-07T07:38:46.5965031Z * [new tag] v1.5.0-rc5 -> v1.5.0-rc5 2025-09-07T07:38:46.5965515Z * [new tag] v1.5.1 -> v1.5.1 2025-09-07T07:38:46.5965809Z * [new tag] v1.5.1-rc1 -> v1.5.1-rc1 2025-09-07T07:38:46.5966136Z * [new tag] v1.6.0 -> v1.6.0 2025-09-07T07:38:46.5966577Z * [new tag] v1.6.0-rc1 -> v1.6.0-rc1 2025-09-07T07:38:46.5967138Z * [new tag] v1.6.0-rc2 -> v1.6.0-rc2 2025-09-07T07:38:46.5967589Z * [new tag] v1.6.0-rc3 -> v1.6.0-rc3 2025-09-07T07:38:46.5968007Z * [new tag] v1.6.0-rc4 -> v1.6.0-rc4 2025-09-07T07:38:46.5968485Z * [new tag] v1.6.0-rc5 -> v1.6.0-rc5 2025-09-07T07:38:46.5969251Z * [new tag] v1.6.0-rc6 -> v1.6.0-rc6 2025-09-07T07:38:46.5969482Z * [new tag] v1.6.0-rc7 -> v1.6.0-rc7 2025-09-07T07:38:46.5969933Z * [new tag] v1.7.0 -> v1.7.0 2025-09-07T07:38:46.5970410Z * [new tag] v1.7.0-rc1 -> v1.7.0-rc1 2025-09-07T07:38:46.5970925Z * [new tag] v1.7.0-rc2 -> v1.7.0-rc2 2025-09-07T07:38:46.5971455Z * [new tag] v1.7.0-rc3 -> v1.7.0-rc3 2025-09-07T07:38:46.5971850Z * [new tag] v1.7.0-rc4 -> v1.7.0-rc4 2025-09-07T07:38:46.5972270Z * [new tag] v1.7.1 -> v1.7.1 2025-09-07T07:38:46.5972796Z * [new tag] v1.7.1-rc1 -> v1.7.1-rc1 2025-09-07T07:38:46.5973258Z * [new tag] v1.7.1-rc2 -> v1.7.1-rc2 2025-09-07T07:38:46.5973538Z * [new tag] v1.7.1-rc3 -> v1.7.1-rc3 2025-09-07T07:38:46.5973998Z * [new tag] v1.8.0 -> v1.8.0 2025-09-07T07:38:46.5974279Z * [new tag] v1.8.0-rc1 -> v1.8.0-rc1 2025-09-07T07:38:46.5974757Z * [new tag] v1.8.0-rc2 -> v1.8.0-rc2 2025-09-07T07:38:46.5975183Z * [new tag] v1.8.0-rc3 -> v1.8.0-rc3 2025-09-07T07:38:46.5975610Z * [new tag] v1.8.0-rc4 -> v1.8.0-rc4 2025-09-07T07:38:46.5975950Z * [new tag] v1.8.0-rc5 -> v1.8.0-rc5 2025-09-07T07:38:46.5976225Z * [new tag] v1.8.1 -> v1.8.1 2025-09-07T07:38:46.5976680Z * [new tag] v1.8.1-rc1 -> v1.8.1-rc1 2025-09-07T07:38:46.5976991Z * [new tag] v1.8.1-rc2 -> v1.8.1-rc2 2025-09-07T07:38:46.5977297Z * [new tag] v1.8.1-rc3 -> v1.8.1-rc3 2025-09-07T07:38:46.5978216Z * [new tag] v1.8.2 -> v1.8.2 2025-09-07T07:38:46.5978509Z * [new tag] v1.8.2-rc1 -> v1.8.2-rc1 2025-09-07T07:38:46.5978934Z * [new tag] v1.9.0 -> v1.9.0 2025-09-07T07:38:46.5979347Z * [new tag] v1.9.0-rc1 -> v1.9.0-rc1 2025-09-07T07:38:46.5979894Z * [new tag] v1.9.0-rc2 -> v1.9.0-rc2 2025-09-07T07:38:46.5980364Z * [new tag] v1.9.0-rc3 -> v1.9.0-rc3 2025-09-07T07:38:46.5980676Z * [new tag] v1.9.0-rc4 -> v1.9.0-rc4 2025-09-07T07:38:46.5981280Z * [new tag] v1.9.1 -> v1.9.1 2025-09-07T07:38:46.5981849Z * [new tag] v1.9.1-rc1 -> v1.9.1-rc1 2025-09-07T07:38:46.5982143Z * [new tag] v1.9.1-rc2 -> v1.9.1-rc2 2025-09-07T07:38:46.5982630Z * [new tag] v2.0.0 -> v2.0.0 2025-09-07T07:38:46.5983040Z * [new tag] v2.0.0-rc1 -> v2.0.0-rc1 2025-09-07T07:38:46.5983570Z * [new tag] v2.0.0-rc2 -> v2.0.0-rc2 2025-09-07T07:38:46.5984259Z * [new tag] v2.0.0-rc3 -> v2.0.0-rc3 2025-09-07T07:38:46.5984662Z * [new tag] v2.0.0-rc4 -> v2.0.0-rc4 2025-09-07T07:38:46.5985079Z * [new tag] v2.0.0-rc5 -> v2.0.0-rc5 2025-09-07T07:38:46.5985437Z * [new tag] v2.0.0-rc6 -> v2.0.0-rc6 2025-09-07T07:38:46.5985888Z * [new tag] v2.0.1 -> v2.0.1 2025-09-07T07:38:46.5986363Z * [new tag] v2.0.1-rc1 -> v2.0.1-rc1 2025-09-07T07:38:46.5986654Z * [new tag] v2.0.1-rc2 -> v2.0.1-rc2 2025-09-07T07:38:46.5987095Z * [new tag] v2.0.1-rc3 -> v2.0.1-rc3 2025-09-07T07:38:46.5987413Z * [new tag] v2.0.1-rc4 -> v2.0.1-rc4 2025-09-07T07:38:46.5988285Z * [new tag] v2.1.0 -> 
v2.1.0 2025-09-07T07:38:46.5988658Z * [new tag] v2.1.0-rc1 -> v2.1.0-rc1 2025-09-07T07:38:46.5989091Z * [new tag] v2.1.0-rc2 -> v2.1.0-rc2 2025-09-07T07:38:46.5989585Z * [new tag] v2.1.0-rc3 -> v2.1.0-rc3 2025-09-07T07:38:46.5990141Z * [new tag] v2.1.0-rc4 -> v2.1.0-rc4 2025-09-07T07:38:46.5990564Z * [new tag] v2.1.0-rc5 -> v2.1.0-rc5 2025-09-07T07:38:46.5990896Z * [new tag] v2.1.0-rc6 -> v2.1.0-rc6 2025-09-07T07:38:46.5991320Z * [new tag] v2.1.1 -> v2.1.1 2025-09-07T07:38:46.5991765Z * [new tag] v2.1.1-rc1 -> v2.1.1-rc1 2025-09-07T07:38:46.5992184Z * [new tag] v2.1.1-rc2 -> v2.1.1-rc2 2025-09-07T07:38:46.5992734Z * [new tag] v2.1.1-rc3 -> v2.1.1-rc3 2025-09-07T07:38:46.5993198Z * [new tag] v2.1.1-rc4 -> v2.1.1-rc4 2025-09-07T07:38:46.5993673Z * [new tag] v2.1.1-rc5 -> v2.1.1-rc5 2025-09-07T07:38:46.5993956Z * [new tag] v2.1.1-rc6 -> v2.1.1-rc6 2025-09-07T07:38:46.5994392Z * [new tag] v2.1.2 -> v2.1.2 2025-09-07T07:38:46.5994881Z * [new tag] v2.1.2-rc1 -> v2.1.2-rc1 2025-09-07T07:38:46.5995358Z * [new tag] v2.1.2-rc2 -> v2.1.2-rc2 2025-09-07T07:38:46.5995757Z * [new tag] v2.1.2-rc3 -> v2.1.2-rc3 2025-09-07T07:38:46.5996187Z * [new tag] v2.2.0 -> v2.2.0 2025-09-07T07:38:46.5996602Z * [new tag] v2.2.0-rc1 -> v2.2.0-rc1 2025-09-07T07:38:46.5997019Z * [new tag] v2.2.0-rc2 -> v2.2.0-rc2 2025-09-07T07:38:46.5997982Z * [new tag] v2.2.0-rc3 -> v2.2.0-rc3 2025-09-07T07:38:46.5998344Z * [new tag] v2.2.0-rc4 -> v2.2.0-rc4 2025-09-07T07:38:46.5998734Z * [new tag] v2.2.0-rc5 -> v2.2.0-rc5 2025-09-07T07:38:46.5999177Z * [new tag] v2.2.0-rc6 -> v2.2.0-rc6 2025-09-07T07:38:46.5999502Z * [new tag] v2.2.0-rc7 -> v2.2.0-rc7 2025-09-07T07:38:46.5999852Z * [new tag] v2.2.0-rc8 -> v2.2.0-rc8 2025-09-07T07:38:46.6000318Z * [new tag] v2.2.1 -> v2.2.1 2025-09-07T07:38:46.6000793Z * [new tag] v2.2.1-rc1 -> v2.2.1-rc1 2025-09-07T07:38:46.6001118Z * [new tag] v2.2.1-rc2 -> v2.2.1-rc2 2025-09-07T07:38:46.6001426Z * [new tag] v2.2.1-rc3 -> v2.2.1-rc3 2025-09-07T07:38:46.6001730Z * [new tag] v2.2.2 -> v2.2.2 2025-09-07T07:38:46.6002228Z * [new tag] v2.2.2-rc1 -> v2.2.2-rc1 2025-09-07T07:38:46.6002542Z * [new tag] v2.2.2-rc2 -> v2.2.2-rc2 2025-09-07T07:38:46.6002795Z * [new tag] v2.2.2-rc3 -> v2.2.2-rc3 2025-09-07T07:38:46.6003308Z * [new tag] v2.3.0 -> v2.3.0 2025-09-07T07:38:46.6003728Z * [new tag] v2.3.0-rc1 -> v2.3.0-rc1 2025-09-07T07:38:46.6004222Z * [new tag] v2.3.0-rc10 -> v2.3.0-rc10 2025-09-07T07:38:46.6004736Z * [new tag] v2.3.0-rc11 -> v2.3.0-rc11 2025-09-07T07:38:46.6005018Z * [new tag] v2.3.0-rc12 -> v2.3.0-rc12 2025-09-07T07:38:46.6005481Z * [new tag] v2.3.0-rc2 -> v2.3.0-rc2 2025-09-07T07:38:46.6005947Z * [new tag] v2.3.0-rc3 -> v2.3.0-rc3 2025-09-07T07:38:46.6006410Z * [new tag] v2.3.0-rc4 -> v2.3.0-rc4 2025-09-07T07:38:46.6006824Z * [new tag] v2.3.0-rc5 -> v2.3.0-rc5 2025-09-07T07:38:46.6007128Z * [new tag] v2.3.0-rc6 -> v2.3.0-rc6 2025-09-07T07:38:46.6007630Z * [new tag] v2.3.0-rc7 -> v2.3.0-rc7 2025-09-07T07:38:46.6008047Z * [new tag] v2.3.0-rc8 -> v2.3.0-rc8 2025-09-07T07:38:46.6008333Z * [new tag] v2.3.0-rc9 -> v2.3.0-rc9 2025-09-07T07:38:46.6008657Z * [new tag] v2.3.1 -> v2.3.1 2025-09-07T07:38:46.6009179Z * [new tag] v2.3.1-rc1 -> v2.3.1-rc1 2025-09-07T07:38:46.6009744Z * [new tag] v2.3.1-rc2 -> v2.3.1-rc2 2025-09-07T07:38:46.6027646Z * [new tag] v2.3.1-rc3 -> v2.3.1-rc3 2025-09-07T07:38:46.6027973Z * [new tag] v2.4.0 -> v2.4.0 2025-09-07T07:38:46.6028224Z * [new tag] v2.4.0-rc1 -> v2.4.0-rc1 2025-09-07T07:38:46.6028456Z * [new tag] v2.4.0-rc2 -> v2.4.0-rc2 2025-09-07T07:38:46.6028682Z * [new tag] 
v2.4.0-rc3 -> v2.4.0-rc3 2025-09-07T07:38:46.6028916Z * [new tag] v2.4.0-rc4 -> v2.4.0-rc4 2025-09-07T07:38:46.6029142Z * [new tag] v2.4.0-rc5 -> v2.4.0-rc5 2025-09-07T07:38:46.6029368Z * [new tag] v2.4.0-rc6 -> v2.4.0-rc6 2025-09-07T07:38:46.6029593Z * [new tag] v2.4.0-rc7 -> v2.4.0-rc7 2025-09-07T07:38:46.6029818Z * [new tag] v2.4.0-rc8 -> v2.4.0-rc8 2025-09-07T07:38:46.6030047Z * [new tag] v2.4.0-rc9 -> v2.4.0-rc9 2025-09-07T07:38:46.6030136Z * [new tag] v2.4.1 -> v2.4.1 2025-09-07T07:38:46.6030227Z * [new tag] v2.4.1-rc1 -> v2.4.1-rc1 2025-09-07T07:38:46.6030313Z * [new tag] v2.4.1-rc2 -> v2.4.1-rc2 2025-09-07T07:38:46.6030488Z * [new tag] v2.4.1-rc3 -> v2.4.1-rc3 2025-09-07T07:38:46.6030585Z * [new tag] v2.5.0 -> v2.5.0 2025-09-07T07:38:46.6030672Z * [new tag] v2.5.0-rc1 -> v2.5.0-rc1 2025-09-07T07:38:46.6030775Z * [new tag] v2.5.0-rc10 -> v2.5.0-rc10 2025-09-07T07:38:46.6030857Z * [new tag] v2.5.0-rc2 -> v2.5.0-rc2 2025-09-07T07:38:46.6030949Z * [new tag] v2.5.0-rc3 -> v2.5.0-rc3 2025-09-07T07:38:46.6031033Z * [new tag] v2.5.0-rc4 -> v2.5.0-rc4 2025-09-07T07:38:46.6031117Z * [new tag] v2.5.0-rc5 -> v2.5.0-rc5 2025-09-07T07:38:46.6031212Z * [new tag] v2.5.0-rc6 -> v2.5.0-rc6 2025-09-07T07:38:46.6031297Z * [new tag] v2.5.0-rc7 -> v2.5.0-rc7 2025-09-07T07:38:46.6031395Z * [new tag] v2.5.0-rc8 -> v2.5.0-rc8 2025-09-07T07:38:46.6031482Z * [new tag] v2.5.0-rc9 -> v2.5.0-rc9 2025-09-07T07:38:46.6031567Z * [new tag] v2.5.1 -> v2.5.1 2025-09-07T07:38:46.6031661Z * [new tag] v2.5.1-rc1 -> v2.5.1-rc1 2025-09-07T07:38:46.6031741Z * [new tag] v2.6.0 -> v2.6.0 2025-09-07T07:38:46.6031835Z * [new tag] v2.6.0-rc1 -> v2.6.0-rc1 2025-09-07T07:38:46.6031922Z * [new tag] v2.6.0-rc2 -> v2.6.0-rc2 2025-09-07T07:38:46.6032007Z * [new tag] v2.6.0-rc3 -> v2.6.0-rc3 2025-09-07T07:38:46.6032102Z * [new tag] v2.6.0-rc4 -> v2.6.0-rc4 2025-09-07T07:38:46.6032184Z * [new tag] v2.6.0-rc5 -> v2.6.0-rc5 2025-09-07T07:38:46.6032277Z * [new tag] v2.6.0-rc6 -> v2.6.0-rc6 2025-09-07T07:38:46.6032359Z * [new tag] v2.6.0-rc7 -> v2.6.0-rc7 2025-09-07T07:38:46.6032511Z * [new tag] v2.6.0-rc8 -> v2.6.0-rc8 2025-09-07T07:38:46.6032597Z * [new tag] v2.6.0-rc9 -> v2.6.0-rc9 2025-09-07T07:38:46.6032690Z * [new tag] v2.7.0 -> v2.7.0 2025-09-07T07:38:46.6032779Z * [new tag] v2.7.0-rc1 -> v2.7.0-rc1 2025-09-07T07:38:46.6032869Z * [new tag] v2.7.0-rc10 -> v2.7.0-rc10 2025-09-07T07:38:46.6032961Z * [new tag] v2.7.0-rc2 -> v2.7.0-rc2 2025-09-07T07:38:46.6033043Z * [new tag] v2.7.0-rc3 -> v2.7.0-rc3 2025-09-07T07:38:46.6033138Z * [new tag] v2.7.0-rc4 -> v2.7.0-rc4 2025-09-07T07:38:46.6033225Z * [new tag] v2.7.0-rc5 -> v2.7.0-rc5 2025-09-07T07:38:46.6033309Z * [new tag] v2.7.0-rc6 -> v2.7.0-rc6 2025-09-07T07:38:46.6033409Z * [new tag] v2.7.0-rc7 -> v2.7.0-rc7 2025-09-07T07:38:46.6033493Z * [new tag] v2.7.0-rc8 -> v2.7.0-rc8 2025-09-07T07:38:46.6033595Z * [new tag] v2.7.0-rc9 -> v2.7.0-rc9 2025-09-07T07:38:46.6033679Z * [new tag] v2.7.1 -> v2.7.1 2025-09-07T07:38:46.6034059Z * [new tag] v2.7.1-rc1 -> v2.7.1-rc1 2025-09-07T07:38:46.6034392Z * [new tag] v2.7.1-rc2 -> v2.7.1-rc2 2025-09-07T07:38:46.6034934Z * [new tag] v2.7.1-rc3 -> v2.7.1-rc3 2025-09-07T07:38:46.6035493Z * [new tag] v2.7.1-rc4 -> v2.7.1-rc4 2025-09-07T07:38:46.6035912Z * [new tag] v2.7.1-rc5 -> v2.7.1-rc5 2025-09-07T07:38:46.6036310Z * [new tag] v2.8.0 -> v2.8.0 2025-09-07T07:38:46.6036739Z * [new tag] v2.8.0-rc1 -> v2.8.0-rc1 2025-09-07T07:38:46.6037476Z * [new tag] v2.8.0-rc2 -> v2.8.0-rc2 2025-09-07T07:38:46.6038024Z * [new tag] v2.8.0-rc3 -> v2.8.0-rc3 
2025-09-07T07:38:46.6038471Z * [new tag] v2.8.0-rc4 -> v2.8.0-rc4 2025-09-07T07:38:46.6039037Z * [new tag] v2.8.0-rc5 -> v2.8.0-rc5 2025-09-07T07:38:46.6039577Z * [new tag] v2.8.0-rc6 -> v2.8.0-rc6 2025-09-07T07:38:46.6039980Z * [new tag] v2.8.0-rc7 -> v2.8.0-rc7 2025-09-07T07:38:46.6040509Z * [new tag] v2.8.0-rc8 -> v2.8.0-rc8 2025-09-07T07:38:46.6040882Z * [new tag] whc_flight_1 -> whc_flight_1 2025-09-07T07:38:46.6041433Z * [new tag] whc_flight_2 -> whc_flight_2 2025-09-07T07:38:46.6041813Z * [new tag] whc_flight_4 -> whc_flight_4 2025-09-07T07:38:46.6445698Z [command]/usr/bin/git rev-parse --verify --quiet 93fb23d6fae7c4e82c4239a1033e522088742634^{object} 2025-09-07T07:38:46.6466813Z 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:38:46.6469269Z ##[endgroup] 2025-09-07T07:38:46.6469497Z ##[group]Determining the checkout info 2025-09-07T07:38:46.6470249Z ##[endgroup] 2025-09-07T07:38:46.6472933Z [command]/usr/bin/git sparse-checkout disable 2025-09-07T07:38:46.6503907Z [command]/usr/bin/git config --local --unset-all extensions.worktreeConfig 2025-09-07T07:38:46.6524376Z ##[group]Checking out the ref 2025-09-07T07:38:46.6527058Z [command]/usr/bin/git checkout --progress --force 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:38:47.5885677Z Note: switching to '93fb23d6fae7c4e82c4239a1033e522088742634'. 2025-09-07T07:38:47.5885908Z 2025-09-07T07:38:47.5886059Z You are in 'detached HEAD' state. You can look around, make experimental 2025-09-07T07:38:47.5886600Z changes and commit them, and you can discard any commits you make in this 2025-09-07T07:38:47.5886934Z state without impacting any branches by switching back to a branch. 2025-09-07T07:38:47.5887115Z 2025-09-07T07:38:47.5887238Z If you want to create a new branch to retain commits you create, you may 2025-09-07T07:38:47.5887531Z do so (now or later) by using -c with the switch command. 
Example: 2025-09-07T07:38:47.5887692Z 2025-09-07T07:38:47.5887785Z git switch -c <new-branch-name> 2025-09-07T07:38:47.5887907Z 2025-09-07T07:38:47.5887983Z Or undo this operation with: 2025-09-07T07:38:47.5888103Z 2025-09-07T07:38:47.5888165Z git switch - 2025-09-07T07:38:47.5888259Z 2025-09-07T07:38:47.5888401Z Turn off this advice by setting config variable advice.detachedHead to false 2025-09-07T07:38:47.5888587Z 2025-09-07T07:38:47.5888706Z HEAD is now at 93fb23d6fae Build vLLM nightly wheels (#162000) 2025-09-07T07:38:47.5933325Z ##[endgroup] 2025-09-07T07:38:47.5933616Z ##[group]Setting up auth for fetching submodules 2025-09-07T07:38:47.5939302Z [command]/usr/bin/git config --global http.https://github.com/.extraheader AUTHORIZATION: basic *** 2025-09-07T07:38:47.5973979Z [command]/usr/bin/git config --global --unset-all url.https://github.com/.insteadOf 2025-09-07T07:38:47.5994708Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf git@github.com: 2025-09-07T07:38:47.6015381Z [command]/usr/bin/git config --global --add url.https://github.com/.insteadOf org-21003710@github.com: 2025-09-07T07:38:47.6034168Z ##[endgroup] 2025-09-07T07:38:47.6034420Z ##[group]Fetching submodules 2025-09-07T07:38:47.6036818Z [command]/usr/bin/git submodule sync --recursive 2025-09-07T07:38:47.6292788Z [command]/usr/bin/git -c protocol.version=2 submodule update --init --force --recursive 2025-09-07T07:38:47.6544707Z Submodule 'android/libs/fbjni' (https://github.com/facebookincubator/fbjni.git) registered for path 'android/libs/fbjni' 2025-09-07T07:38:47.6860967Z Submodule 'third_party/NNPACK_deps/FP16' (https://github.com/Maratyszcza/FP16.git) registered for path 'third_party/FP16' 2025-09-07T07:38:47.6861766Z Submodule 'third_party/NNPACK_deps/FXdiv' (https://github.com/Maratyszcza/FXdiv.git) registered for path 'third_party/FXdiv' 2025-09-07T07:38:47.6863222Z Submodule 'third_party/NNPACK' (https://github.com/Maratyszcza/NNPACK.git) registered for path 'third_party/NNPACK' 2025-09-07T07:38:47.6864657Z Submodule 'third_party/NVTX' (https://github.com/NVIDIA/NVTX.git) registered for path 'third_party/NVTX' 2025-09-07T07:38:47.6867949Z Submodule 'third_party/VulkanMemoryAllocator' (https://github.com/GPUOpen-LibrariesAndSDKs/VulkanMemoryAllocator.git) registered for path 'third_party/VulkanMemoryAllocator' 2025-09-07T07:38:47.6879961Z Submodule 'third_party/XNNPACK' (https://github.com/google/XNNPACK.git) registered for path 'third_party/XNNPACK' 2025-09-07T07:38:47.6881671Z Submodule 'third_party/aiter' (https://github.com/ROCm/aiter.git) registered for path 'third_party/aiter' 2025-09-07T07:38:47.6883568Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/benchmark' 2025-09-07T07:38:47.6885588Z Submodule 'third_party/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/composable_kernel' 2025-09-07T07:38:47.6887587Z Submodule 'third_party/cpp-httplib' (https://github.com/yhirose/cpp-httplib.git) registered for path 'third_party/cpp-httplib' 2025-09-07T07:38:47.6900455Z Submodule 'third_party/cpuinfo' (https://github.com/pytorch/cpuinfo.git) registered for path 'third_party/cpuinfo' 2025-09-07T07:38:47.6902366Z Submodule 'third_party/cudnn_frontend' (https://github.com/NVIDIA/cudnn-frontend.git) registered for path 'third_party/cudnn_frontend' 2025-09-07T07:38:47.6904171Z Submodule 'third_party/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/cutlass'
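For reference, the commands above are the standard actions/checkout flow: fetch the remote refs (including the trunk/* tags mirrored earlier), verify the requested commit exists, force-check it out in detached-HEAD state, then rewrite SSH-style GitHub remotes to HTTPS so the injected token also covers submodule fetches. A rough local approximation, not taken verbatim from this log (the clone path and the exact fetch invocation are assumptions), might look like:

  # Hypothetical sketch: reproduce the pinned checkout against an existing clone.
  cd /path/to/pytorch                     # assumed local clone of pytorch/pytorch
  git fetch origin --tags                 # brings in the tags listed above
  git rev-parse --verify --quiet 93fb23d6fae7c4e82c4239a1033e522088742634^{object}
  git checkout --force 93fb23d6fae7c4e82c4239a1033e522088742634   # detached HEAD, as the notice above describes
  # Route SSH-style submodule remotes through HTTPS so one credential works for all of them:
  git config --global url.https://github.com/.insteadOf git@github.com:
  git submodule sync --recursive
  git -c protocol.version=2 submodule update --init --force --recursive

The insteadOf rewrites mean any git@github.com: URLs that appear in nested .gitmodules files are fetched over HTTPS using the same masked Authorization header configured above, rather than needing SSH keys on the runner.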
2025-09-07T07:38:47.6906193Z Submodule 'third_party/fbgemm' (https://github.com/pytorch/fbgemm) registered for path 'third_party/fbgemm' 2025-09-07T07:38:47.6908407Z Submodule 'third_party/flash-attention' (https://github.com/Dao-AILab/flash-attention.git) registered for path 'third_party/flash-attention' 2025-09-07T07:38:47.6921226Z Submodule 'third_party/flatbuffers' (https://github.com/google/flatbuffers.git) registered for path 'third_party/flatbuffers' 2025-09-07T07:38:47.6923192Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/fmt' 2025-09-07T07:38:47.6926375Z Submodule 'third_party/gemmlowp/gemmlowp' (https://github.com/google/gemmlowp.git) registered for path 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:38:47.6928547Z Submodule 'third_party/gloo' (https://github.com/pytorch/gloo) registered for path 'third_party/gloo' 2025-09-07T07:38:47.6931129Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/googletest' 2025-09-07T07:38:47.6945280Z Submodule 'third_party/ideep' (https://github.com/intel/ideep) registered for path 'third_party/ideep' 2025-09-07T07:38:47.6947578Z Submodule 'third_party/ittapi' (https://github.com/intel/ittapi.git) registered for path 'third_party/ittapi' 2025-09-07T07:38:47.6949844Z Submodule 'third_party/kineto' (https://github.com/pytorch/kineto) registered for path 'third_party/kineto' 2025-09-07T07:38:47.6952213Z Submodule 'third_party/kleidiai' (https://github.com/ARM-software/kleidiai.git) registered for path 'third_party/kleidiai' 2025-09-07T07:38:47.6954542Z Submodule 'third_party/mimalloc' (https://github.com/microsoft/mimalloc.git) registered for path 'third_party/mimalloc' 2025-09-07T07:38:47.6967706Z Submodule 'third_party/nlohmann' (https://github.com/nlohmann/json.git) registered for path 'third_party/nlohmann' 2025-09-07T07:38:47.6970127Z Submodule 'third_party/onnx' (https://github.com/onnx/onnx.git) registered for path 'third_party/onnx' 2025-09-07T07:38:47.6972831Z Submodule 'third_party/opentelemetry-cpp' (https://github.com/open-telemetry/opentelemetry-cpp.git) registered for path 'third_party/opentelemetry-cpp' 2025-09-07T07:38:47.6975254Z Submodule 'third_party/pocketfft' (https://github.com/mreineck/pocketfft) registered for path 'third_party/pocketfft' 2025-09-07T07:38:47.6977899Z Submodule 'third_party/protobuf' (https://github.com/protocolbuffers/protobuf.git) registered for path 'third_party/protobuf' 2025-09-07T07:38:47.6991726Z Submodule 'third_party/NNPACK_deps/psimd' (https://github.com/Maratyszcza/psimd.git) registered for path 'third_party/psimd' 2025-09-07T07:38:47.6994325Z Submodule 'third_party/NNPACK_deps/pthreadpool' (https://github.com/Maratyszcza/pthreadpool.git) registered for path 'third_party/pthreadpool' 2025-09-07T07:38:47.6996780Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/pybind11' 2025-09-07T07:38:47.6999515Z Submodule 'third_party/python-peachpy' (https://github.com/malfet/PeachPy.git) registered for path 'third_party/python-peachpy' 2025-09-07T07:38:47.7003265Z Submodule 'third_party/sleef' (https://github.com/shibatch/sleef) registered for path 'third_party/sleef' 2025-09-07T07:38:47.7017403Z Submodule 'third_party/tensorpipe' (https://github.com/pytorch/tensorpipe.git) registered for path 'third_party/tensorpipe' 2025-09-07T07:38:47.7041337Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/android/libs/fbjni'... 
2025-09-07T07:38:47.9211440Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FXdiv'... 2025-09-07T07:38:47.9212099Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/FP16'... 2025-09-07T07:38:47.9212539Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/psimd'... 2025-09-07T07:38:47.9212974Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NNPACK'... 2025-09-07T07:38:47.9345327Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/NVTX'... 2025-09-07T07:38:48.0805198Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pocketfft'... 2025-09-07T07:38:48.0805980Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pthreadpool'... 2025-09-07T07:38:48.0806508Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/python-peachpy'... 2025-09-07T07:38:48.0806973Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep'... 2025-09-07T07:38:48.0807468Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gloo'... 2025-09-07T07:38:48.0807928Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/gemmlowp/gemmlowp'... 2025-09-07T07:38:48.0870736Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/VulkanMemoryAllocator'... 2025-09-07T07:38:48.9654912Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/benchmark'... 2025-09-07T07:38:48.9655488Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe'... 2025-09-07T07:38:48.9655974Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kleidiai'... 2025-09-07T07:38:48.9656421Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ittapi'... 2025-09-07T07:38:48.9656881Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention'... 2025-09-07T07:38:48.9657375Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpp-httplib'... 2025-09-07T07:38:48.9657829Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cpuinfo'... 2025-09-07T07:38:48.9658278Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/googletest'... 2025-09-07T07:38:48.9658725Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/sleef'... 2025-09-07T07:38:48.9659170Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/mimalloc'... 2025-09-07T07:38:48.9659833Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/pybind11'... 2025-09-07T07:38:48.9660270Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fmt'... 2025-09-07T07:38:48.9660700Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto'... 2025-09-07T07:38:48.9996451Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/XNNPACK'... 2025-09-07T07:38:56.9305747Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cudnn_frontend'... 2025-09-07T07:38:56.9306226Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flatbuffers'... 2025-09-07T07:38:56.9306620Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm'... 
2025-09-07T07:38:56.9306993Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/cutlass'... 2025-09-07T07:38:56.9307405Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx'... 2025-09-07T07:38:56.9307816Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/composable_kernel'... 2025-09-07T07:38:56.9308213Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/aiter'... 2025-09-07T07:38:56.9308613Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp'... 2025-09-07T07:38:56.9309012Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/nlohmann'... 2025-09-07T07:38:56.9309393Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf'... 2025-09-07T07:38:56.9420691Z Submodule path 'android/libs/fbjni': checked out '7e1e1fe3858c63c251c637ae41a20de425dde96f' 2025-09-07T07:38:56.9513283Z Submodule path 'third_party/FP16': checked out '4dfe081cf6bcd15db339cf2680b9281b8451eeb3' 2025-09-07T07:38:56.9584884Z Submodule path 'third_party/FXdiv': checked out 'b408327ac2a15ec3e43352421954f5b1967701d1' 2025-09-07T07:38:56.9766388Z Submodule path 'third_party/NNPACK': checked out 'c07e3a0400713d546e0dea2d5466dd22ea389c73' 2025-09-07T07:38:57.0369987Z Submodule path 'third_party/NVTX': checked out '2942f167cc30c5e3a44a2aecd5b0d9c07ff61a07' 2025-09-07T07:38:57.0780592Z Submodule path 'third_party/VulkanMemoryAllocator': checked out '1d8f600fd424278486eade7ed3e877c99f0846b1' 2025-09-07T07:38:57.5739131Z Submodule path 'third_party/XNNPACK': checked out '51a0103656eff6fc9bfd39a4597923c4b542c883' 2025-09-07T07:38:57.6925398Z Submodule path 'third_party/aiter': checked out '01aae101b9e5e94d6c16a9514c9fb8df99c93150' 2025-09-07T07:38:57.6929220Z Submodule '3rdparty/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:38:57.6950012Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/aiter/3rdparty/composable_kernel'... 
2025-09-07T07:39:00.7128840Z Submodule path 'third_party/aiter/3rdparty/composable_kernel': checked out 'cffe8fa2a442ac8e80dd236a1a5d24fe3d7e0cbf' 2025-09-07T07:39:00.7298371Z Submodule path 'third_party/benchmark': checked out '299e5928955cc62af9968370293b916f5130916f' 2025-09-07T07:39:00.9556753Z Submodule path 'third_party/composable_kernel': checked out '7fe50dc3da2069d6645d9deb8c017a876472a977' 2025-09-07T07:39:00.9896395Z Submodule path 'third_party/cpp-httplib': checked out '89c932f313c6437c38f2982869beacc89c2f2246' 2025-09-07T07:39:01.0662813Z Submodule path 'third_party/cpuinfo': checked out '5e3d2445e6a84d9599bee2bf78edbb4d80865e1d' 2025-09-07T07:39:01.1002188Z Submodule path 'third_party/cudnn_frontend': checked out 'f937055efc6d414d11f4c6577e3977fe74f35fb6' 2025-09-07T07:39:01.5745493Z Submodule path 'third_party/cutlass': checked out 'e51efbfe18fe4f4cbb66ab814c55bf4aa0185491' 2025-09-07T07:39:01.6814550Z Submodule path 'third_party/fbgemm': checked out '4b39c551efe15e6bbade20565b0ceb2d8ce3352d' 2025-09-07T07:39:01.6826476Z Submodule 'external/asmjit' (https://github.com/asmjit/asmjit.git) registered for path 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:01.6827742Z Submodule 'external/composable_kernel' (https://github.com/jwfromm/composable_kernel.git) registered for path 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:01.6828913Z Submodule 'external/cpuinfo' (https://github.com/pytorch/cpuinfo) registered for path 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:01.6830207Z Submodule 'external/cutlass' (https://github.com/jwfromm/cutlass) registered for path 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:01.6831531Z Submodule 'external/googletest' (https://github.com/google/googletest) registered for path 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:01.6833333Z Submodule 'external/hipify_torch' (https://github.com/ROCmSoftwarePlatform/hipify_torch.git) registered for path 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:01.6834627Z Submodule 'external/json' (https://github.com/nlohmann/json.git) registered for path 'third_party/fbgemm/external/json' 2025-09-07T07:39:01.6855983Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/asmjit'... 2025-09-07T07:39:02.6942971Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/hipify_torch'... 2025-09-07T07:39:02.6943519Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/cpuinfo'... 2025-09-07T07:39:02.6943998Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/googletest'... 2025-09-07T07:39:02.6944485Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/composable_kernel'... 2025-09-07T07:39:02.7943089Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/cutlass'... 2025-09-07T07:39:03.4486141Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/fbgemm/external/json'... 
2025-09-07T07:39:06.9336318Z Submodule path 'third_party/fbgemm/external/asmjit': checked out 'a3199e8857792cd10b7589ff5d58343d2c9008ea' 2025-09-07T07:39:07.1148041Z Submodule path 'third_party/fbgemm/external/composable_kernel': checked out 'b1281b8b08d973a7064f864f47eeb30f3e2596e9' 2025-09-07T07:39:07.1932532Z Submodule path 'third_party/fbgemm/external/cpuinfo': checked out '6543fec09b2f04ac4a666882998b534afc9c1349' 2025-09-07T07:39:07.6618035Z Submodule path 'third_party/fbgemm/external/cutlass': checked out '311f3c8e51dc0eb56310cfc6980bf63d0fbd7917' 2025-09-07T07:39:07.6978867Z Submodule path 'third_party/fbgemm/external/googletest': checked out '52eb8108c5bdec04579160ae17225d66034bd723' 2025-09-07T07:39:07.7068265Z Submodule path 'third_party/fbgemm/external/hipify_torch': checked out '63b6a7b541fa7f08f8475ca7d74054db36ff2691' 2025-09-07T07:39:07.7870738Z Submodule path 'third_party/fbgemm/external/json': checked out '9cca280a4d0ccf0c08f47a99aa71d1b0e52f8d03' 2025-09-07T07:39:07.8381651Z Submodule path 'third_party/flash-attention': checked out '979702c87a8713a8e0a5e9fee122b90d2ef13be5' 2025-09-07T07:39:07.8393366Z Submodule 'csrc/composable_kernel' (https://github.com/ROCm/composable_kernel.git) registered for path 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:07.8394462Z Submodule 'csrc/cutlass' (https://github.com/NVIDIA/cutlass.git) registered for path 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:07.8413465Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/composable_kernel'... 2025-09-07T07:39:10.6573252Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/flash-attention/csrc/cutlass'... 2025-09-07T07:39:10.8223328Z Submodule path 'third_party/flash-attention/csrc/composable_kernel': checked out '888317e698e9803c62bd38568abc9e05d7709f33' 2025-09-07T07:39:11.2480770Z Submodule path 'third_party/flash-attention/csrc/cutlass': checked out 'c506e16788cb08416a4a57e11a9067beeee29420' 2025-09-07T07:39:11.3457379Z Submodule path 'third_party/flatbuffers': checked out 'a2cd1ea3b6d3fee220106b5fed3f7ce8da9eb757' 2025-09-07T07:39:11.3711731Z Submodule path 'third_party/fmt': checked out '40626af88bd7df9a5fb80be7b25ac85b122d6c21' 2025-09-07T07:39:11.4008075Z Submodule path 'third_party/gemmlowp/gemmlowp': checked out '3fb5c176c17c765a3492cd2f0321b0dab712f350' 2025-09-07T07:39:11.4184990Z Submodule path 'third_party/gloo': checked out 'c7b7b022c124d9643957d9bd55f57ac59fce8fa2' 2025-09-07T07:39:11.4526929Z Submodule path 'third_party/googletest': checked out '52eb8108c5bdec04579160ae17225d66034bd723' 2025-09-07T07:39:11.4621770Z Submodule path 'third_party/ideep': checked out '719d8e6cd7f7a0e01b155657526d693acf97c2b3' 2025-09-07T07:39:11.4633853Z Submodule 'mkl-dnn' (https://github.com/intel/mkl-dnn.git) registered for path 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:11.4652413Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/ideep/mkl-dnn'... 
2025-09-07T07:39:22.5307009Z Submodule path 'third_party/ideep/mkl-dnn': checked out '8d263e693366ef8db40acc569cc7d8edf644556d' 2025-09-07T07:39:22.5460038Z Submodule path 'third_party/ittapi': checked out 'dec1d23ca65ab069d225dfe40dea14f455170959' 2025-09-07T07:39:22.6233120Z Submodule path 'third_party/kineto': checked out '5e7501833f1021ce6f618572d3baf657b6319658' 2025-09-07T07:39:22.6243029Z Submodule 'libkineto/third_party/dynolog' (https://github.com/facebookincubator/dynolog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:22.6244092Z Submodule 'libkineto/third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:22.6245432Z Submodule 'libkineto/third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:22.6267173Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog'... 2025-09-07T07:39:23.2006817Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/fmt'... 2025-09-07T07:39:23.4576610Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/googletest'... 2025-09-07T07:39:23.5219409Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog': checked out '7d04a0053a845370ae06ce317a22a48e9edcc74e' 2025-09-07T07:39:23.5227676Z Submodule 'third_party/DCGM' (https://github.com/NVIDIA/DCGM.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:23.5229055Z Submodule 'third_party/cpr' (https://github.com/libcpr/cpr.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:23.5230297Z Submodule 'third_party/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:23.5231620Z Submodule 'third_party/gflags' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:23.5232928Z Submodule 'third_party/glog' (https://github.com/google/glog.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:23.5234414Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:23.5235910Z Submodule 'third_party/json' (https://github.com/nlohmann/json.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:23.5237317Z Submodule 'third_party/pfs' (https://github.com/dtrugman/pfs.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:23.5260590Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM'... 2025-09-07T07:39:24.6811178Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/pfs'... 2025-09-07T07:39:24.6811948Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags'... 
2025-09-07T07:39:24.6812660Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/cpr'... 2025-09-07T07:39:24.6813326Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/glog'... 2025-09-07T07:39:24.6814006Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/googletest'... 2025-09-07T07:39:24.6814689Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/fmt'... 2025-09-07T07:39:24.7811900Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/json'... 2025-09-07T07:39:28.9153231Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM': checked out 'ffde4e54bc7249a6039a5e6b45b395141e1217f9' 2025-09-07T07:39:28.9284091Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr': checked out '871ed52d350214a034f6ef8a3b8f51c5ce1bd400' 2025-09-07T07:39:28.9555709Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt': checked out 'cd4af11efc9c622896a3e4cb599fa28668ca3d05' 2025-09-07T07:39:28.9655989Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags': checked out 'e171aa2d15ed9eb17054558e0b3a6a413bb01067' 2025-09-07T07:39:28.9665731Z Submodule 'doc' (https://github.com/gflags/gflags.git) registered for path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:28.9687665Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc'... 
2025-09-07T07:39:29.6029156Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc': checked out '8411df715cf522606e3b1aca386ddfc0b63d34b4' 2025-09-07T07:39:29.6169906Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog': checked out 'b33e3bad4c46c8a6345525fd822af355e5ef9446' 2025-09-07T07:39:29.6481673Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest': checked out '58d77fa8070e8cec2dc1ed015d66b454c8d78850' 2025-09-07T07:39:29.7239092Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/json': checked out '4f8fba14066156b73f1189a2b8bd568bde5284c5' 2025-09-07T07:39:29.7360607Z Submodule path 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs': checked out 'f68a2fa8ea36c783bdd760371411fcb495aa3150' 2025-09-07T07:39:29.7628551Z Submodule path 'third_party/kineto/libkineto/third_party/fmt': checked out '0041a40c1350ba702d475b9c4ad62da77caea164' 2025-09-07T07:39:29.8088012Z Submodule path 'third_party/kineto/libkineto/third_party/googletest': checked out '7aca84427f224eeed3144123d5230d5871e93347' 2025-09-07T07:39:29.8421867Z Submodule path 'third_party/kleidiai': checked out 'cca02c2f69dd18e1f12647c1c0bdc8cf90e680c7' 2025-09-07T07:39:29.8718122Z Submodule path 'third_party/mimalloc': checked out 'fbd8b99c2b828428947d70fdc046bb55609be93e' 2025-09-07T07:39:29.9592549Z Submodule path 'third_party/nlohmann': checked out '55f93686c01528224f448c19128836e7df245f72' 2025-09-07T07:39:30.2238702Z Submodule path 'third_party/onnx': checked out 'e709452ef2bbc1d113faf678c24e6d3467696e83' 2025-09-07T07:39:30.2261336Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/onnx/third_party/pybind11' 2025-09-07T07:39:30.2281310Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/onnx/third_party/pybind11'... 
2025-09-07T07:39:31.1783604Z Submodule path 'third_party/onnx/third_party/pybind11': checked out 'a2e59f0e7065404b44dfe92a28aca47ba1378dc4' 2025-09-07T07:39:31.2276899Z Submodule path 'third_party/opentelemetry-cpp': checked out 'a799f4aed9c94b765dcdaabaeab7d5e7e2310878' 2025-09-07T07:39:31.2288116Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark) registered for path 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:31.2289320Z Submodule 'third_party/googletest' (https://github.com/google/googletest) registered for path 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:31.2290852Z Submodule 'third_party/ms-gsl' (https://github.com/microsoft/GSL) registered for path 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:31.2292538Z Submodule 'third_party/nlohmann-json' (https://github.com/nlohmann/json) registered for path 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:31.2293983Z Submodule 'third_party/opentelemetry-proto' (https://github.com/open-telemetry/opentelemetry-proto) registered for path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:31.2295376Z Submodule 'third_party/opentracing-cpp' (https://github.com/opentracing/opentracing-cpp.git) registered for path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:31.2296750Z Submodule 'third_party/prometheus-cpp' (https://github.com/jupp0r/prometheus-cpp) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:31.2298191Z Submodule 'tools/vcpkg' (https://github.com/Microsoft/vcpkg) registered for path 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:31.2320863Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/benchmark'... 2025-09-07T07:39:31.6800767Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentracing-cpp'... 2025-09-07T07:39:31.6801739Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/opentelemetry-proto'... 2025-09-07T07:39:31.6802347Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp'... 2025-09-07T07:39:31.6802892Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/ms-gsl'... 2025-09-07T07:39:31.7799373Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/googletest'... 2025-09-07T07:39:32.2799574Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/nlohmann-json'... 2025-09-07T07:39:37.3220438Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/tools/vcpkg'... 
2025-09-07T07:39:37.3367413Z Submodule path 'third_party/opentelemetry-cpp/third_party/benchmark': checked out 'd572f4777349d43653b21d6c2fc63020ab326db2' 2025-09-07T07:39:37.3683469Z Submodule path 'third_party/opentelemetry-cpp/third_party/googletest': checked out 'b796f7d44681514f58a683a3a71ff17c94edb0c1' 2025-09-07T07:39:37.3816379Z Submodule path 'third_party/opentelemetry-cpp/third_party/ms-gsl': checked out '6f4529395c5b7c2d661812257cd6780c67e54afa' 2025-09-07T07:39:37.4622165Z Submodule path 'third_party/opentelemetry-cpp/third_party/nlohmann-json': checked out 'bc889afb4c5bf1c0d8ee29ef35eaaf4c8bef8a5d' 2025-09-07T07:39:37.4723843Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto': checked out '4ca4f0335c63cda7ab31ea7ed70d6553aee14dce' 2025-09-07T07:39:37.4835810Z Submodule path 'third_party/opentelemetry-cpp/third_party/opentracing-cpp': checked out '06b57f48ded1fa3bdd3d4346f6ef29e40e08eaf5' 2025-09-07T07:39:37.4945201Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp': checked out 'c9ffcdda9086ffd9e1283ea7a0276d831f3c8a8d' 2025-09-07T07:39:37.4956627Z Submodule 'civetweb' (https://github.com/civetweb/civetweb.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:37.4957838Z Submodule 'googletest' (https://github.com/google/googletest.git) registered for path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:37.4977730Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb'... 2025-09-07T07:39:38.8916140Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest'... 2025-09-07T07:39:39.0892586Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb': checked out 'eefb26f82b233268fc98577d265352720d477ba4' 2025-09-07T07:39:39.1266203Z Submodule path 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest': checked out 'e2239ee6043f73722e7aa812a459f54a28552929' 2025-09-07T07:39:39.4328839Z Submodule path 'third_party/opentelemetry-cpp/tools/vcpkg': checked out '8eb57355a4ffb410a2e94c07b4dca2dffbee8e50' 2025-09-07T07:39:39.4417485Z Submodule path 'third_party/pocketfft': checked out '0fa0ef591e38c2758e3184c6c23e497b9f732ffa' 2025-09-07T07:39:39.6458296Z Submodule path 'third_party/protobuf': checked out 'd1eca4e4b421cd2997495c4b4e65cea6be4e9b8a' 2025-09-07T07:39:39.6470296Z Submodule 'third_party/benchmark' (https://github.com/google/benchmark.git) registered for path 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:39.6471532Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:39.6493895Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/benchmark'... 2025-09-07T07:39:40.2830906Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/protobuf/third_party/googletest'... 
2025-09-07T07:39:40.4017364Z Submodule path 'third_party/protobuf/third_party/benchmark': checked out '5b7683f49e1e9223cf9927b24f6fd3d6bd82e3f8' 2025-09-07T07:39:40.4567060Z Submodule path 'third_party/protobuf/third_party/googletest': checked out '5ec7f0c4a113e2f18ac2c6cc7df51ad6afc24081' 2025-09-07T07:39:40.4635437Z Submodule path 'third_party/psimd': checked out '072586a71b55b7f8c584153d223e95687148a900' 2025-09-07T07:39:40.4728510Z Submodule path 'third_party/pthreadpool': checked out '4fe0e1e183925bf8cfa6aae24237e724a96479b8' 2025-09-07T07:39:40.5025681Z Submodule path 'third_party/pybind11': checked out 'f5fbe867d2d26e4a0a9177a51f6e568868ad3dc8' 2025-09-07T07:39:40.5233433Z Submodule path 'third_party/python-peachpy': checked out 'f45429b087dd7d5bc78bb40dc7cf06425c252d67' 2025-09-07T07:39:40.5561435Z Submodule path 'third_party/sleef': checked out '5a1d179df9cf652951b59010a2d2075372d67f68' 2025-09-07T07:39:40.5750902Z Submodule path 'third_party/tensorpipe': checked out 'af0118d13e52f5a08841464a768e01a0bf3e3075' 2025-09-07T07:39:40.5760683Z Submodule 'third_party/googletest' (https://github.com/google/googletest.git) registered for path 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:40.5761415Z Submodule 'third_party/libnop' (https://github.com/google/libnop.git) registered for path 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:40.5762697Z Submodule 'third_party/libuv' (https://github.com/libuv/libuv.git) registered for path 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:40.5763938Z Submodule 'third_party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:40.5785706Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/googletest'... 2025-09-07T07:39:41.3230911Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libnop'... 2025-09-07T07:39:41.3657152Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/libuv'... 2025-09-07T07:39:41.5748874Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11'... 2025-09-07T07:39:41.6194159Z Submodule path 'third_party/tensorpipe/third_party/googletest': checked out 'aee0f9d9b5b87796ee8a0ab26b7587ec30e8858e' 2025-09-07T07:39:41.6310111Z Submodule path 'third_party/tensorpipe/third_party/libnop': checked out '910b55815be16109f04f4180e9adee14fb4ce281' 2025-09-07T07:39:41.6859328Z Submodule path 'third_party/tensorpipe/third_party/libuv': checked out '5152db2cbfeb5582e9c27c5ea1dba2cd9e10759b' 2025-09-07T07:39:41.7075780Z Submodule path 'third_party/tensorpipe/third_party/pybind11': checked out 'a23996fce38ff6ccfbcdc09f1e63f2c4be5ea2ef' 2025-09-07T07:39:41.7086229Z Submodule 'tools/clang' (https://github.com/wjakob/clang-cindex-python3) registered for path 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:41.7106305Z Cloning into '/home/ec2-user/actions-runner/_work/pytorch/pytorch/third_party/tensorpipe/third_party/pybind11/tools/clang'... 
2025-09-07T07:39:41.9392564Z Submodule path 'third_party/tensorpipe/third_party/pybind11/tools/clang': checked out '6a00cbc4a9b8e68b71caf7f774b3f9c753ae84d5' 2025-09-07T07:39:41.9422151Z [command]/usr/bin/git submodule foreach --recursive git config --local gc.auto 0 2025-09-07T07:39:41.9678620Z Entering 'android/libs/fbjni' 2025-09-07T07:39:41.9710161Z Entering 'third_party/FP16' 2025-09-07T07:39:41.9743007Z Entering 'third_party/FXdiv' 2025-09-07T07:39:41.9776764Z Entering 'third_party/NNPACK' 2025-09-07T07:39:41.9812448Z Entering 'third_party/NVTX' 2025-09-07T07:39:41.9843224Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T07:39:41.9876845Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:41.9918770Z Entering 'third_party/aiter' 2025-09-07T07:39:41.9948902Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:41.9984190Z Entering 'third_party/benchmark' 2025-09-07T07:39:42.0016275Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:42.0055964Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:42.0090262Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:42.0127076Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:42.0159894Z Entering 'third_party/cutlass' 2025-09-07T07:39:42.0199011Z Entering 'third_party/fbgemm' 2025-09-07T07:39:42.0228726Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:42.0257927Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:42.0295269Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:42.0327220Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:42.0366144Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:42.0398510Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:42.0426316Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:42.0462518Z Entering 'third_party/flash-attention' 2025-09-07T07:39:42.0496717Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:42.0531919Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:42.0569613Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:42.0606401Z Entering 'third_party/fmt' 2025-09-07T07:39:42.0638960Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:42.0673027Z Entering 'third_party/gloo' 2025-09-07T07:39:42.0707938Z Entering 'third_party/googletest' 2025-09-07T07:39:42.0739504Z Entering 'third_party/ideep' 2025-09-07T07:39:42.0771089Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:42.0809702Z Entering 'third_party/ittapi' 2025-09-07T07:39:42.0841890Z Entering 'third_party/kineto' 2025-09-07T07:39:42.0874834Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:42.0907996Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:42.0940103Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:42.0971253Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:42.1003199Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:42.1032490Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:42.1065877Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:42.1098299Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:42.1127770Z Entering 
'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:42.1161177Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:42.1198090Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:42.1230916Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:42.1261604Z Entering 'third_party/kleidiai' 2025-09-07T07:39:42.1295699Z Entering 'third_party/mimalloc' 2025-09-07T07:39:42.1327012Z Entering 'third_party/nlohmann' 2025-09-07T07:39:42.1361529Z Entering 'third_party/onnx' 2025-09-07T07:39:42.1407909Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T07:39:42.1442205Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:42.1477478Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:42.1509955Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:42.1541722Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:42.1573260Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:42.1604770Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:42.1636980Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:42.1669307Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:42.1701859Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:42.1736338Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:42.1767153Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:42.1811632Z Entering 'third_party/pocketfft' 2025-09-07T07:39:42.1844722Z Entering 'third_party/protobuf' 2025-09-07T07:39:42.1878063Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:42.1911173Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:42.1945609Z Entering 'third_party/psimd' 2025-09-07T07:39:42.1979964Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:42.2013406Z Entering 'third_party/pybind11' 2025-09-07T07:39:42.2049322Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:42.2082771Z Entering 'third_party/sleef' 2025-09-07T07:39:42.2111621Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:42.2143221Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:42.2175371Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:42.2207285Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:42.2239473Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:42.2269563Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:42.2315811Z ##[endgroup] 2025-09-07T07:39:42.2316152Z ##[group]Persisting credentials for submodules 2025-09-07T07:39:42.2321271Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'url\.https\:\/\/github\.com\/\.insteadOf' && git config --local --unset-all 'url.https://github.com/.insteadOf' || :" 2025-09-07T07:39:42.2575209Z Entering 'android/libs/fbjni' 2025-09-07T07:39:42.2618439Z Entering 'third_party/FP16' 2025-09-07T07:39:42.2661829Z Entering 'third_party/FXdiv' 2025-09-07T07:39:42.2707950Z Entering 'third_party/NNPACK' 2025-09-07T07:39:42.2749155Z Entering 'third_party/NVTX' 2025-09-07T07:39:42.2791478Z Entering 'third_party/VulkanMemoryAllocator' 
2025-09-07T07:39:42.2832728Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:42.2884370Z Entering 'third_party/aiter' 2025-09-07T07:39:42.2927585Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:42.2977974Z Entering 'third_party/benchmark' 2025-09-07T07:39:42.3020615Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:42.3067382Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:42.3112323Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:42.3158417Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:42.3201369Z Entering 'third_party/cutlass' 2025-09-07T07:39:42.3249195Z Entering 'third_party/fbgemm' 2025-09-07T07:39:42.3294364Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:42.3338463Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:42.3388166Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:42.3429176Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:42.3480886Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:42.3521582Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:42.3564401Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:42.3612267Z Entering 'third_party/flash-attention' 2025-09-07T07:39:42.3653063Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:42.3700666Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:42.3751555Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:42.3798023Z Entering 'third_party/fmt' 2025-09-07T07:39:42.3843832Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:42.3889920Z Entering 'third_party/gloo' 2025-09-07T07:39:42.3933739Z Entering 'third_party/googletest' 2025-09-07T07:39:42.3975854Z Entering 'third_party/ideep' 2025-09-07T07:39:42.4019455Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:42.4068279Z Entering 'third_party/ittapi' 2025-09-07T07:39:42.4111818Z Entering 'third_party/kineto' 2025-09-07T07:39:42.4156349Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:42.4199544Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:42.4241877Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:42.4283619Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:42.4325192Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:42.4365116Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:42.4408637Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:42.4453926Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:42.4499151Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:42.4544006Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:42.4592425Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:42.4631494Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:42.4675390Z Entering 'third_party/kleidiai' 2025-09-07T07:39:42.4717948Z Entering 'third_party/mimalloc' 2025-09-07T07:39:42.4760761Z Entering 'third_party/nlohmann' 2025-09-07T07:39:42.4802109Z Entering 'third_party/onnx' 2025-09-07T07:39:42.4859859Z Entering 'third_party/onnx/third_party/pybind11' 
2025-09-07T07:39:42.4905682Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:42.4946108Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:42.4991646Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:42.5036227Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:42.5079084Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:42.5122775Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:42.5163682Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:42.5205468Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:42.5244766Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:42.5288960Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:42.5331462Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:42.5389198Z Entering 'third_party/pocketfft' 2025-09-07T07:39:42.5430279Z Entering 'third_party/protobuf' 2025-09-07T07:39:42.5475950Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:42.5517992Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:42.5563607Z Entering 'third_party/psimd' 2025-09-07T07:39:42.5608472Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:42.5652529Z Entering 'third_party/pybind11' 2025-09-07T07:39:42.5697031Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:42.5742947Z Entering 'third_party/sleef' 2025-09-07T07:39:42.5787320Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:42.5828937Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:42.5869810Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:42.5913012Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:42.5958676Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:42.6000400Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:42.6067906Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local 'http.https://github.com/.extraheader' 'AUTHORIZATION: basic ***' && git config --local --show-origin --name-only --get-regexp remote.origin.url" 2025-09-07T07:39:42.6318027Z Entering 'android/libs/fbjni' 2025-09-07T07:39:42.6357992Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/android/libs/fbjni/config remote.origin.url 2025-09-07T07:39:42.6369961Z Entering 'third_party/FP16' 2025-09-07T07:39:42.6409401Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FP16/config remote.origin.url 2025-09-07T07:39:42.6421359Z Entering 'third_party/FXdiv' 2025-09-07T07:39:42.6460858Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/FXdiv/config remote.origin.url 2025-09-07T07:39:42.6474983Z Entering 'third_party/NNPACK' 2025-09-07T07:39:42.6518832Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK/config remote.origin.url 2025-09-07T07:39:42.6533311Z Entering 'third_party/NVTX' 2025-09-07T07:39:42.6575914Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NVTX/config remote.origin.url 2025-09-07T07:39:42.6589016Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T07:39:42.6624030Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/VulkanMemoryAllocator/config remote.origin.url 2025-09-07T07:39:42.6636705Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:42.6676233Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/XNNPACK/config remote.origin.url 2025-09-07T07:39:42.6700147Z Entering 'third_party/aiter' 2025-09-07T07:39:42.6741572Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/aiter/config remote.origin.url 2025-09-07T07:39:42.6756419Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:42.6797290Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/aiter/modules/3rdparty/composable_kernel/config remote.origin.url 2025-09-07T07:39:42.6815907Z Entering 'third_party/benchmark' 2025-09-07T07:39:42.6857184Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/benchmark/config remote.origin.url 2025-09-07T07:39:42.6872052Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:42.6911394Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/composable_kernel/config remote.origin.url 2025-09-07T07:39:42.6929591Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:42.6966490Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpp-httplib/config remote.origin.url 2025-09-07T07:39:42.6978881Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:42.7021752Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cpuinfo/config remote.origin.url 2025-09-07T07:39:42.7034824Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:42.7075983Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cudnn_frontend/config remote.origin.url 2025-09-07T07:39:42.7088110Z Entering 'third_party/cutlass' 2025-09-07T07:39:42.7127356Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/cutlass/config remote.origin.url 2025-09-07T07:39:42.7145177Z Entering 'third_party/fbgemm' 2025-09-07T07:39:42.7186900Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/config remote.origin.url 2025-09-07T07:39:42.7201382Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:42.7242679Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/asmjit/config remote.origin.url 2025-09-07T07:39:42.7254858Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:42.7297695Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/composable_kernel/config remote.origin.url 2025-09-07T07:39:42.7314114Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:42.7348697Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/cpuinfo/config remote.origin.url 2025-09-07T07:39:42.7361603Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:42.7401496Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/cutlass/config remote.origin.url 2025-09-07T07:39:42.7419499Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:42.7456691Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/googletest/config remote.origin.url 2025-09-07T07:39:42.7471357Z Entering 
'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:42.7511070Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/hipify_torch/config remote.origin.url 2025-09-07T07:39:42.7523478Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:42.7563009Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fbgemm/modules/external/json/config remote.origin.url 2025-09-07T07:39:42.7578980Z Entering 'third_party/flash-attention' 2025-09-07T07:39:42.7619300Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/config remote.origin.url 2025-09-07T07:39:42.7630530Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:42.7667524Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/composable_kernel/config remote.origin.url 2025-09-07T07:39:42.7685200Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:42.7724095Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flash-attention/modules/csrc/cutlass/config remote.origin.url 2025-09-07T07:39:42.7742335Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:42.7784293Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/flatbuffers/config remote.origin.url 2025-09-07T07:39:42.7799234Z Entering 'third_party/fmt' 2025-09-07T07:39:42.7839870Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/fmt/config remote.origin.url 2025-09-07T07:39:42.7852863Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:42.7893963Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gemmlowp/gemmlowp/config remote.origin.url 2025-09-07T07:39:42.7906638Z Entering 'third_party/gloo' 2025-09-07T07:39:42.7946470Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/gloo/config remote.origin.url 2025-09-07T07:39:42.7958811Z Entering 'third_party/googletest' 2025-09-07T07:39:42.7999444Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/googletest/config remote.origin.url 2025-09-07T07:39:42.8011310Z Entering 'third_party/ideep' 2025-09-07T07:39:42.8052120Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/config remote.origin.url 2025-09-07T07:39:42.8065020Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:42.8103273Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ideep/modules/mkl-dnn/config remote.origin.url 2025-09-07T07:39:42.8119991Z Entering 'third_party/ittapi' 2025-09-07T07:39:42.8157708Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/ittapi/config remote.origin.url 2025-09-07T07:39:42.8171056Z Entering 'third_party/kineto' 2025-09-07T07:39:42.8212171Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/config remote.origin.url 2025-09-07T07:39:42.8224780Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:42.8263875Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/config remote.origin.url 2025-09-07T07:39:42.8275483Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:42.8315844Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/DCGM/config remote.origin.url 2025-09-07T07:39:42.8330273Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:42.8368143Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/cpr/config remote.origin.url 2025-09-07T07:39:42.8380307Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:42.8420487Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/fmt/config remote.origin.url 2025-09-07T07:39:42.8431578Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:42.8469010Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/config remote.origin.url 2025-09-07T07:39:42.8481645Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:42.8521930Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/gflags/modules/doc/config remote.origin.url 2025-09-07T07:39:42.8536836Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:42.8578779Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/glog/config remote.origin.url 2025-09-07T07:39:42.8590929Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:42.8629998Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/googletest/config remote.origin.url 2025-09-07T07:39:42.8642002Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:42.8678691Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/json/config remote.origin.url 2025-09-07T07:39:42.8692343Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:42.8731946Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/dynolog/modules/third_party/pfs/config remote.origin.url 2025-09-07T07:39:42.8747099Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:42.8788382Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/fmt/config remote.origin.url 2025-09-07T07:39:42.8799845Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:42.8839400Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kineto/modules/libkineto/third_party/googletest/config remote.origin.url 2025-09-07T07:39:42.8853745Z Entering 'third_party/kleidiai' 2025-09-07T07:39:42.8893503Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/kleidiai/config remote.origin.url 2025-09-07T07:39:42.8905656Z Entering 'third_party/mimalloc' 2025-09-07T07:39:42.8944782Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/mimalloc/config remote.origin.url 2025-09-07T07:39:42.8957797Z Entering 'third_party/nlohmann' 2025-09-07T07:39:42.9000575Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/nlohmann/config remote.origin.url 2025-09-07T07:39:42.9014093Z Entering 'third_party/onnx' 2025-09-07T07:39:42.9071539Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/config remote.origin.url 2025-09-07T07:39:42.9095917Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T07:39:42.9136427Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/onnx/modules/third_party/pybind11/config remote.origin.url 2025-09-07T07:39:42.9151184Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:42.9193533Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/config remote.origin.url 2025-09-07T07:39:42.9207420Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:42.9244453Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/benchmark/config remote.origin.url 2025-09-07T07:39:42.9256117Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:42.9293237Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/googletest/config remote.origin.url 2025-09-07T07:39:42.9305975Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:42.9343315Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/ms-gsl/config remote.origin.url 2025-09-07T07:39:42.9355805Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:42.9396888Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/nlohmann-json/config remote.origin.url 2025-09-07T07:39:42.9408943Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:42.9446553Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentelemetry-proto/config remote.origin.url 2025-09-07T07:39:42.9458210Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:42.9499762Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/opentracing-cpp/config remote.origin.url 2025-09-07T07:39:42.9511661Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:42.9549614Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/config remote.origin.url 2025-09-07T07:39:42.9561408Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:42.9604830Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/civetweb/config remote.origin.url 2025-09-07T07:39:42.9617741Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:42.9656629Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/third_party/prometheus-cpp/modules/googletest/config remote.origin.url 2025-09-07T07:39:42.9671410Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:42.9711163Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/opentelemetry-cpp/modules/tools/vcpkg/config remote.origin.url 2025-09-07T07:39:42.9737804Z Entering 'third_party/pocketfft' 2025-09-07T07:39:42.9780512Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pocketfft/config remote.origin.url 2025-09-07T07:39:42.9791554Z Entering 'third_party/protobuf' 2025-09-07T07:39:42.9831903Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/config remote.origin.url 2025-09-07T07:39:42.9847054Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:42.9888561Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/benchmark/config remote.origin.url 2025-09-07T07:39:42.9901750Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:42.9940292Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/protobuf/modules/third_party/googletest/config remote.origin.url 2025-09-07T07:39:42.9953513Z Entering 'third_party/psimd' 2025-09-07T07:39:42.9996435Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/psimd/config remote.origin.url 2025-09-07T07:39:43.0007363Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:43.0044180Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/NNPACK_deps/pthreadpool/config remote.origin.url 2025-09-07T07:39:43.0056934Z Entering 'third_party/pybind11' 2025-09-07T07:39:43.0099298Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/pybind11/config remote.origin.url 2025-09-07T07:39:43.0110834Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:43.0148977Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/python-peachpy/config remote.origin.url 2025-09-07T07:39:43.0161072Z Entering 'third_party/sleef' 2025-09-07T07:39:43.0203142Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/sleef/config remote.origin.url 2025-09-07T07:39:43.0215227Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:43.0254066Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/config remote.origin.url 2025-09-07T07:39:43.0268449Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:43.0309349Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/googletest/config remote.origin.url 2025-09-07T07:39:43.0320730Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:43.0358087Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libnop/config remote.origin.url 2025-09-07T07:39:43.0369800Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:43.0408505Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/libuv/config remote.origin.url 2025-09-07T07:39:43.0420430Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:43.0459332Z 
file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/config remote.origin.url 2025-09-07T07:39:43.0470364Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:43.0513624Z file:/home/ec2-user/actions-runner/_work/pytorch/pytorch/.git/modules/third_party/tensorpipe/modules/third_party/pybind11/modules/tools/clang/config remote.origin.url 2025-09-07T07:39:43.1014375Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'git@github.com:' 2025-09-07T07:39:43.1254852Z Entering 'android/libs/fbjni' 2025-09-07T07:39:43.1287119Z Entering 'third_party/FP16' 2025-09-07T07:39:43.1317031Z Entering 'third_party/FXdiv' 2025-09-07T07:39:43.1350928Z Entering 'third_party/NNPACK' 2025-09-07T07:39:43.1385968Z Entering 'third_party/NVTX' 2025-09-07T07:39:43.1416566Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T07:39:43.1448547Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:43.1489701Z Entering 'third_party/aiter' 2025-09-07T07:39:43.1521033Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:43.1558724Z Entering 'third_party/benchmark' 2025-09-07T07:39:43.1591843Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:43.1630732Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:43.1661379Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:43.1693808Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:43.1726850Z Entering 'third_party/cutlass' 2025-09-07T07:39:43.1765416Z Entering 'third_party/fbgemm' 2025-09-07T07:39:43.1800732Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:43.1834801Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:43.1872919Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:43.1903262Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:43.1937967Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:43.1968482Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:43.1999729Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:43.2035248Z Entering 'third_party/flash-attention' 2025-09-07T07:39:43.2070601Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:43.2108542Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:43.2143536Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:43.2177428Z Entering 'third_party/fmt' 2025-09-07T07:39:43.2210486Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:43.2243422Z Entering 'third_party/gloo' 2025-09-07T07:39:43.2277166Z Entering 'third_party/googletest' 2025-09-07T07:39:43.2309401Z Entering 'third_party/ideep' 2025-09-07T07:39:43.2342372Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:43.2377468Z Entering 'third_party/ittapi' 2025-09-07T07:39:43.2409070Z Entering 'third_party/kineto' 2025-09-07T07:39:43.2442081Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:43.2474385Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:43.2506587Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:43.2538744Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:43.2570183Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:43.2605235Z Entering 
'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:43.2639612Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:43.2672103Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:43.2702309Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:43.2734679Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:43.2765989Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:43.2797085Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:43.2832656Z Entering 'third_party/kleidiai' 2025-09-07T07:39:43.2868293Z Entering 'third_party/mimalloc' 2025-09-07T07:39:43.2901998Z Entering 'third_party/nlohmann' 2025-09-07T07:39:43.2936039Z Entering 'third_party/onnx' 2025-09-07T07:39:43.2979700Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T07:39:43.3014457Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:43.3047604Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:43.3080226Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:43.3114663Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:43.3147242Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:43.3182418Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:43.3213007Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:43.3241947Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:43.3271617Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:43.3305867Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:43.3339948Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:43.3385330Z Entering 'third_party/pocketfft' 2025-09-07T07:39:43.3418206Z Entering 'third_party/protobuf' 2025-09-07T07:39:43.3455539Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:43.3487751Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:43.3521382Z Entering 'third_party/psimd' 2025-09-07T07:39:43.3551996Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:43.3585749Z Entering 'third_party/pybind11' 2025-09-07T07:39:43.3616604Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:43.3649794Z Entering 'third_party/sleef' 2025-09-07T07:39:43.3682868Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:43.3716351Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:43.3747269Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:43.3779211Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:43.3808999Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:43.3839548Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:43.3888121Z [command]/usr/bin/git submodule foreach --recursive git config --local --add 'url.https://github.com/.insteadOf' 'org-21003710@github.com:' 2025-09-07T07:39:43.4128331Z Entering 'android/libs/fbjni' 2025-09-07T07:39:43.4163069Z Entering 'third_party/FP16' 2025-09-07T07:39:43.4193888Z Entering 'third_party/FXdiv' 2025-09-07T07:39:43.4227750Z Entering 'third_party/NNPACK' 
2025-09-07T07:39:43.4261257Z Entering 'third_party/NVTX' 2025-09-07T07:39:43.4297066Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T07:39:43.4327910Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:43.4371775Z Entering 'third_party/aiter' 2025-09-07T07:39:43.4423037Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:43.4456706Z Entering 'third_party/benchmark' 2025-09-07T07:39:43.4490655Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:43.4526903Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:43.4560145Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:43.4592910Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:43.4627077Z Entering 'third_party/cutlass' 2025-09-07T07:39:43.4665194Z Entering 'third_party/fbgemm' 2025-09-07T07:39:43.4700696Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:43.4732276Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:43.4768545Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:43.4800749Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:43.4837146Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:43.4869743Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:43.4901892Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:43.4937896Z Entering 'third_party/flash-attention' 2025-09-07T07:39:43.4975159Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:43.5012717Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:43.5049978Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:43.5085197Z Entering 'third_party/fmt' 2025-09-07T07:39:43.5119775Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:43.5153474Z Entering 'third_party/gloo' 2025-09-07T07:39:43.5186334Z Entering 'third_party/googletest' 2025-09-07T07:39:43.5217685Z Entering 'third_party/ideep' 2025-09-07T07:39:43.5249486Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:43.5288027Z Entering 'third_party/ittapi' 2025-09-07T07:39:43.5320033Z Entering 'third_party/kineto' 2025-09-07T07:39:43.5353528Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:43.5386336Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:43.5418497Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:43.5449766Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:43.5485030Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:43.5517032Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:43.5550260Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:43.5581687Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:43.5614655Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:43.5647885Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:43.5681040Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:43.5714684Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:43.5748220Z Entering 'third_party/kleidiai' 2025-09-07T07:39:43.5783242Z Entering 'third_party/mimalloc' 2025-09-07T07:39:43.5816181Z Entering 'third_party/nlohmann' 
2025-09-07T07:39:43.5851107Z Entering 'third_party/onnx' 2025-09-07T07:39:43.5897679Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T07:39:43.5930401Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:43.5963293Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:43.5997575Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:43.6026841Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:43.6056569Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:43.6087945Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:43.6119796Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:43.6152229Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:43.6183710Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:43.6217989Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:43.6251119Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:43.6298518Z Entering 'third_party/pocketfft' 2025-09-07T07:39:43.6329299Z Entering 'third_party/protobuf' 2025-09-07T07:39:43.6362449Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:43.6394807Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:43.6429197Z Entering 'third_party/psimd' 2025-09-07T07:39:43.6461065Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:43.6496731Z Entering 'third_party/pybind11' 2025-09-07T07:39:43.6530256Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:43.6561043Z Entering 'third_party/sleef' 2025-09-07T07:39:43.6596141Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:43.6626220Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:43.6655431Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:43.6686849Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:43.6717059Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:43.6749824Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:43.6797174Z ##[endgroup] 2025-09-07T07:39:43.6827476Z [command]/usr/bin/git log -1 --format=%H 2025-09-07T07:39:43.6846997Z 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:39:43.6926319Z ##[group]Run cd "${GITHUB_WORKSPACE}" 2025-09-07T07:39:43.6926587Z cd "${GITHUB_WORKSPACE}" 2025-09-07T07:39:43.6926825Z # Clean stale submodule dirs 2025-09-07T07:39:43.6927051Z if [ -z "${NO_SUDO}" ]; then 2025-09-07T07:39:43.6927312Z  sudo git submodule foreach --recursive git clean -ffdx 2025-09-07T07:39:43.6927570Z else 2025-09-07T07:39:43.6927789Z  git submodule foreach --recursive git clean -ffdx 2025-09-07T07:39:43.6928033Z fi 2025-09-07T07:39:43.6936791Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:43.6937053Z env: 2025-09-07T07:39:43.6937227Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:43.6937422Z NO_SUDO: true 2025-09-07T07:39:43.6937586Z ##[endgroup] 2025-09-07T07:39:43.7207891Z Entering 'android/libs/fbjni' 2025-09-07T07:39:43.7233259Z Entering 'third_party/FP16' 2025-09-07T07:39:43.7259605Z Entering 'third_party/FXdiv' 2025-09-07T07:39:43.7287640Z Entering 'third_party/NNPACK' 2025-09-07T07:39:43.7313297Z Entering 'third_party/NVTX' 2025-09-07T07:39:43.7341740Z Entering 'third_party/VulkanMemoryAllocator' 
2025-09-07T07:39:43.7369885Z Entering 'third_party/XNNPACK' 2025-09-07T07:39:43.7459891Z Entering 'third_party/aiter' 2025-09-07T07:39:43.7490429Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T07:39:43.7569379Z Entering 'third_party/benchmark' 2025-09-07T07:39:43.7595828Z Entering 'third_party/composable_kernel' 2025-09-07T07:39:43.7679284Z Entering 'third_party/cpp-httplib' 2025-09-07T07:39:43.7705991Z Entering 'third_party/cpuinfo' 2025-09-07T07:39:43.7732223Z Entering 'third_party/cudnn_frontend' 2025-09-07T07:39:43.7760144Z Entering 'third_party/cutlass' 2025-09-07T07:39:43.7838741Z Entering 'third_party/fbgemm' 2025-09-07T07:39:43.7884570Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T07:39:43.7909267Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T07:39:43.7987620Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T07:39:43.8014999Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T07:39:43.8088948Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T07:39:43.8113706Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T07:39:43.8137615Z Entering 'third_party/fbgemm/external/json' 2025-09-07T07:39:43.8172998Z Entering 'third_party/flash-attention' 2025-09-07T07:39:43.8206325Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T07:39:43.8280407Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T07:39:43.8342137Z Entering 'third_party/flatbuffers' 2025-09-07T07:39:43.8394376Z Entering 'third_party/fmt' 2025-09-07T07:39:43.8420821Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T07:39:43.8446484Z Entering 'third_party/gloo' 2025-09-07T07:39:43.8472509Z Entering 'third_party/googletest' 2025-09-07T07:39:43.8501207Z Entering 'third_party/ideep' 2025-09-07T07:39:43.8525626Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T07:39:43.8585195Z Entering 'third_party/ittapi' 2025-09-07T07:39:43.8611470Z Entering 'third_party/kineto' 2025-09-07T07:39:43.8639451Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T07:39:43.8667749Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T07:39:43.8703805Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T07:39:43.8729037Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T07:39:43.8756985Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T07:39:43.8780637Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T07:39:43.8804712Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T07:39:43.8825288Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T07:39:43.8850858Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T07:39:43.8884509Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T07:39:43.8907258Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T07:39:43.8931270Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T07:39:43.8958667Z Entering 'third_party/kleidiai' 2025-09-07T07:39:43.8989005Z Entering 'third_party/mimalloc' 2025-09-07T07:39:43.9014601Z Entering 'third_party/nlohmann' 2025-09-07T07:39:43.9047432Z Entering 'third_party/onnx' 2025-09-07T07:39:43.9256572Z Entering 'third_party/onnx/third_party/pybind11' 
2025-09-07T07:39:43.9281702Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T07:39:43.9318641Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T07:39:43.9343387Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T07:39:43.9370152Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T07:39:43.9394216Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T07:39:43.9425161Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T07:39:43.9451599Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T07:39:43.9476910Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T07:39:43.9500832Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T07:39:43.9535680Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T07:39:43.9561347Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T07:39:43.9737136Z Entering 'third_party/pocketfft' 2025-09-07T07:39:43.9758796Z Entering 'third_party/protobuf' 2025-09-07T07:39:43.9811947Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T07:39:43.9835532Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T07:39:43.9865856Z Entering 'third_party/psimd' 2025-09-07T07:39:43.9891066Z Entering 'third_party/pthreadpool' 2025-09-07T07:39:43.9914653Z Entering 'third_party/pybind11' 2025-09-07T07:39:43.9940682Z Entering 'third_party/python-peachpy' 2025-09-07T07:39:43.9967687Z Entering 'third_party/sleef' 2025-09-07T07:39:43.9992666Z Entering 'third_party/tensorpipe' 2025-09-07T07:39:44.0019404Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T07:39:44.0044580Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T07:39:44.0070982Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T07:39:44.0097424Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T07:39:44.0121250Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T07:39:44.0235725Z Prepare all required actions 2025-09-07T07:39:44.0236125Z Getting action download info 2025-09-07T07:39:44.1517141Z ##[group]Run ./.github/actions/setup-linux 2025-09-07T07:39:44.1517376Z env: 2025-09-07T07:39:44.1517550Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:44.1517735Z ##[endgroup] 2025-09-07T07:39:44.1546217Z ##[group]Run set -euo pipefail 2025-09-07T07:39:44.1546486Z set -euo pipefail 2025-09-07T07:39:44.1546853Z function get_ec2_metadata() { 2025-09-07T07:39:44.1547117Z  # Pulled from instance metadata endpoint for EC2 2025-09-07T07:39:44.1547558Z  # see https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-retrieval.html 2025-09-07T07:39:44.1547944Z  category=$1 2025-09-07T07:39:44.1548207Z  # If it is GCP runner (runner name contains gcp), do not run this 2025-09-07T07:39:44.1548499Z  runner_name_str=i-081e6be8c4291059d 2025-09-07T07:39:44.1548787Z  if [[ -f /.inarc ]]; then 2025-09-07T07:39:44.1549038Z  echo "ARC Runner, no info on ec2 metadata" 2025-09-07T07:39:44.1549301Z  elif [[ $runner_name_str == *"gcp"* ]]; then 2025-09-07T07:39:44.1549612Z  echo "Runner is from Google Cloud Platform, No info on ec2 metadata" 2025-09-07T07:39:44.1549891Z  else 2025-09-07T07:39:44.1550463Z  curl -H "X-aws-ec2-metadata-token: $(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")" -fsSL 
"http://169.254.169.254/latest/meta-data/${category}" 2025-09-07T07:39:44.1551048Z  fi 2025-09-07T07:39:44.1551206Z } 2025-09-07T07:39:44.1551408Z echo "ami-id: $(get_ec2_metadata ami-id)" 2025-09-07T07:39:44.1551703Z echo "instance-id: $(get_ec2_metadata instance-id)" 2025-09-07T07:39:44.1552023Z echo "instance-type: $(get_ec2_metadata instance-type)" 2025-09-07T07:39:44.1552307Z echo "system info $(uname -a)" 2025-09-07T07:39:44.1557660Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:44.1557927Z env: 2025-09-07T07:39:44.1558102Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:44.1558288Z ##[endgroup] 2025-09-07T07:39:44.1670418Z ami-id: ami-05ffe3c48a9991133 2025-09-07T07:39:44.1746597Z instance-id: i-081e6be8c4291059d 2025-09-07T07:39:44.1816183Z instance-type: c7i.metal-24xl 2025-09-07T07:39:44.1826073Z system info Linux ip-10-0-37-56.ec2.internal 6.1.141-155.222.amzn2023.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Jun 17 10:29:47 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux 2025-09-07T07:39:44.1857777Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-09-07T07:39:44.1858307Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-09-07T07:39:44.1862523Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:44.1862776Z env: 2025-09-07T07:39:44.1862929Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:44.1863099Z ##[endgroup] 2025-09-07T07:39:44.1898057Z ##[group]Run if systemctl is-active --quiet docker; then 2025-09-07T07:39:44.1898369Z if systemctl is-active --quiet docker; then 2025-09-07T07:39:44.1898627Z  echo "Docker daemon is running..."; 2025-09-07T07:39:44.1898850Z else 2025-09-07T07:39:44.1899100Z  echo "Starting docker daemon..." && sudo systemctl start docker; 2025-09-07T07:39:44.1899390Z fi 2025-09-07T07:39:44.1903967Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:44.1904224Z env: 2025-09-07T07:39:44.1904401Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:44.1904593Z ##[endgroup] 2025-09-07T07:39:44.1968724Z Docker daemon is running... 2025-09-07T07:39:44.1993619Z ##[group]Run nick-fields/retry@v3.0.0 2025-09-07T07:39:44.1993821Z with: 2025-09-07T07:39:44.1993954Z shell: bash 2025-09-07T07:39:44.1994195Z timeout_minutes: 5 2025-09-07T07:39:44.1994361Z max_attempts: 3 2025-09-07T07:39:44.1994520Z retry_wait_seconds: 30 2025-09-07T07:39:44.1995720Z command: AWS_ACCOUNT_ID=$(aws sts get-caller-identity|grep Account|cut -f4 -d\") aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" # For LF Runners we need to make sure we also login to Meta's ECR docker registry too. 
META_AWS_ACCOUNT_ID=308535385114 if [ "$AWS_ACCOUNT_ID" != "$META_AWS_ACCOUNT_ID" ] ; then aws ecr get-login-password --region "$AWS_DEFAULT_REGION" | docker login --username AWS \ --password-stdin "$META_AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com" fi 2025-09-07T07:39:44.1996956Z polling_interval_seconds: 1 2025-09-07T07:39:44.1997132Z warning_on_retry: true 2025-09-07T07:39:44.1997291Z continue_on_error: false 2025-09-07T07:39:44.1997452Z env: 2025-09-07T07:39:44.1997593Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:44.1997760Z AWS_RETRY_MODE: standard 2025-09-07T07:39:44.1997913Z AWS_MAX_ATTEMPTS: 5 2025-09-07T07:39:44.1998084Z AWS_DEFAULT_REGION: us-east-1 2025-09-07T07:39:44.1998262Z ##[endgroup] 2025-09-07T07:39:45.0002616Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-09-07T07:39:45.0003110Z Configure a credential helper to remove this warning. See 2025-09-07T07:39:45.0003513Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-09-07T07:39:45.0003776Z 2025-09-07T07:39:45.0003955Z Login Succeeded 2025-09-07T07:39:45.2628732Z Command completed after 1 attempt(s). 2025-09-07T07:39:45.2668765Z ##[group]Run env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:39:45.2669085Z env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:39:45.2669353Z env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:39:45.2674652Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:45.2674875Z env: 2025-09-07T07:39:45.2675026Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.2675200Z ##[endgroup] 2025-09-07T07:39:45.2731049Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-09-07T07:39:45.2731403Z # ignore expansion of "docker ps -q" since it could be empty 2025-09-07T07:39:45.2731673Z # shellcheck disable=SC2046 2025-09-07T07:39:45.2731905Z docker stop $(docker ps -q) || true 2025-09-07T07:39:45.2732129Z # Prune all of the docker images 2025-09-07T07:39:45.2732331Z docker system prune -af 2025-09-07T07:39:45.2736793Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:45.2737030Z env: 2025-09-07T07:39:45.2737182Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.2737354Z ##[endgroup] 2025-09-07T07:39:45.3211617Z "docker stop" requires at least 1 argument. 2025-09-07T07:39:45.3211999Z See 'docker stop --help'. 2025-09-07T07:39:45.3212177Z 2025-09-07T07:39:45.3212308Z Usage: docker stop [OPTIONS] CONTAINER [CONTAINER...] 2025-09-07T07:39:45.3212481Z 2025-09-07T07:39:45.3212595Z Stop one or more running containers 2025-09-07T07:39:45.3367078Z Total reclaimed space: 0B 2025-09-07T07:39:45.3386540Z ##[group]Run set +e 2025-09-07T07:39:45.3386740Z set +e 2025-09-07T07:39:45.3386907Z set -x 2025-09-07T07:39:45.3387065Z  2025-09-07T07:39:45.3387244Z PT_DOMAIN=download.pytorch.org 2025-09-07T07:39:45.3387612Z # TODO: Flaky access to download.pytorch.org https://github.com/pytorch/pytorch/issues/100400, 2025-09-07T07:39:45.3388046Z # cleaning this up once the issue is fixed. There are more than one resolved IP here, the last 2025-09-07T07:39:45.3388354Z # one is returned at random 2025-09-07T07:39:45.3388617Z RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" | tail -n1) 2025-09-07T07:39:45.3388848Z  2025-09-07T07:39:45.3389110Z if [ -z "${RESOLVED_IP}" ]; then 2025-09-07T07:39:45.3389384Z  echo "Couldn't resolve ${PT_DOMAIN}, retrying with Google DNS..." 
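[Editor's note: the script that begins above and continues below works around flaky DNS for download.pytorch.org (pytorch/pytorch#100400) by resolving the domain once, falling back to Google DNS, and pinning the result in /etc/hosts. A condensed sketch of the same idea, assuming dig and sudo are available:

  PT_DOMAIN=download.pytorch.org
  RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" | tail -n1)
  [ -n "${RESOLVED_IP}" ] || RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" @8.8.8.8 | tail -n1)
  sudo sed -i "/${PT_DOMAIN}/d" /etc/hosts              # drop any stale pin first
  echo "${RESOLVED_IP} ${PT_DOMAIN}" | sudo tee -a /etc/hosts

The raw log resumes below.]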
2025-09-07T07:39:45.3389701Z  RESOLVED_IP=$(dig -4 +short "${PT_DOMAIN}" @8.8.8.8 | tail -n1) 2025-09-07T07:39:45.3389935Z  2025-09-07T07:39:45.3390088Z  if [ -z "${RESOLVED_IP}" ]; then 2025-09-07T07:39:45.3390335Z  echo "Couldn't resolve ${PT_DOMAIN}, exiting..." 2025-09-07T07:39:45.3390650Z  exit 1 2025-09-07T07:39:45.3390809Z  fi 2025-09-07T07:39:45.3390949Z fi 2025-09-07T07:39:45.3391090Z  2025-09-07T07:39:45.3391264Z if grep -r "${PT_DOMAIN}" /etc/hosts; then 2025-09-07T07:39:45.3391490Z  # Clean up any old records first 2025-09-07T07:39:45.3391706Z  sudo sed -i "/${PT_DOMAIN}/d" /etc/hosts 2025-09-07T07:39:45.3391910Z fi 2025-09-07T07:39:45.3392051Z  2025-09-07T07:39:45.3392253Z echo "${RESOLVED_IP} ${PT_DOMAIN}" | sudo tee -a /etc/hosts 2025-09-07T07:39:45.3392492Z cat /etc/hosts 2025-09-07T07:39:45.3396462Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:45.3396702Z env: 2025-09-07T07:39:45.3396868Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.3397041Z ##[endgroup] 2025-09-07T07:39:45.3412638Z + PT_DOMAIN=download.pytorch.org 2025-09-07T07:39:45.3417024Z ++ dig -4 +short download.pytorch.org 2025-09-07T07:39:45.3418754Z ++ tail -n1 2025-09-07T07:39:45.3878004Z + RESOLVED_IP=18.160.10.76 2025-09-07T07:39:45.3878341Z + '[' -z 18.160.10.76 ']' 2025-09-07T07:39:45.3878582Z + grep -r download.pytorch.org /etc/hosts 2025-09-07T07:39:45.3889211Z + echo '18.160.10.76 download.pytorch.org' 2025-09-07T07:39:45.3890536Z + sudo tee -a /etc/hosts 2025-09-07T07:39:45.7591203Z 18.160.10.76 download.pytorch.org 2025-09-07T07:39:45.7602626Z + cat /etc/hosts 2025-09-07T07:39:45.7608114Z 127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4 2025-09-07T07:39:45.7612557Z ::1 localhost6 localhost6.localdomain6 2025-09-07T07:39:45.7612807Z 18.160.10.76 download.pytorch.org 2025-09-07T07:39:45.7721274Z ##[group]Run pytorch/test-infra/.github/actions/calculate-docker-image@main 2025-09-07T07:39:45.7721556Z with: 2025-09-07T07:39:45.7722062Z docker-image-name: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7722625Z use-custom-docker-registry: true 2025-09-07T07:39:45.7722823Z docker-build-dir: .ci/docker 2025-09-07T07:39:45.7723021Z docker-build-script: ./build.sh 2025-09-07T07:39:45.7723215Z working-directory: . 2025-09-07T07:39:45.7723440Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7723673Z force-push: false 2025-09-07T07:39:45.7723827Z env: 2025-09-07T07:39:45.7723970Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.7724143Z ##[endgroup] 2025-09-07T07:39:45.7739249Z ##[group]Run set -ex 2025-09-07T07:39:45.7739464Z set -ex 2025-09-07T07:39:45.7739616Z  2025-09-07T07:39:45.7739900Z # If the docker build directory or the build script doesn't exist, the action will 2025-09-07T07:39:45.7740279Z # gracefully return the docker image name as it is. 
Pulling docker image in Linux 2025-09-07T07:39:45.7740607Z # job could then download the pre-built image as usual 2025-09-07T07:39:45.7740993Z if [[ -d "${DOCKER_BUILD_DIR}" ]] && [[ -f "${DOCKER_BUILD_DIR}/${DOCKER_BUILD_SCRIPT}" ]] && [[ "${USE_CUSTOM_DOCKER_REGISTRY}" == "true" ]]; then 2025-09-07T07:39:45.7741375Z  echo "skip=false" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7741575Z else 2025-09-07T07:39:45.7741749Z  echo "skip=true" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7742009Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7742249Z  2025-09-07T07:39:45.7742578Z  echo "Not using custom ECR registry. Either it was not requested or there is no Docker build script in the ${REPO_NAME} repo..." 2025-09-07T07:39:45.7742924Z  exit 0 2025-09-07T07:39:45.7743064Z fi 2025-09-07T07:39:45.7743201Z  2025-09-07T07:39:45.7743413Z if [[ "${DOCKER_IMAGE_NAME}" == *"${DOCKER_REGISTRY}/${REPO_NAME}"* ]]; then 2025-09-07T07:39:45.7743753Z  # The docker image name already includes the ECR prefix and tag, so we can just 2025-09-07T07:39:45.7744154Z  # use it as it is, but first let's extract the tag 2025-09-07T07:39:45.7744436Z  DOCKER_TAG=$(echo "${DOCKER_IMAGE_NAME}" | awk -F '[:,]' '{print $2}') 2025-09-07T07:39:45.7744737Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7745023Z  echo "docker-image=${DOCKER_IMAGE_NAME}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7745261Z else 2025-09-07T07:39:45.7745441Z  if [[ "${DOCKER_IMAGE_NAME}" == *:* ]]; then 2025-09-07T07:39:45.7745671Z  CUSTOM_TAG_PREFIX=${DOCKER_IMAGE_NAME#*:} 2025-09-07T07:39:45.7745933Z  DOCKER_IMAGE_NAME=${DOCKER_IMAGE_NAME%%:*} 2025-09-07T07:39:45.7746146Z  fi 2025-09-07T07:39:45.7746417Z  DOCKER_TAG=${CUSTOM_TAG_PREFIX:+${CUSTOM_TAG_PREFIX}-}$(git rev-parse HEAD:"${DOCKER_BUILD_DIR}") 2025-09-07T07:39:45.7746771Z  echo "docker-tag=${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7747136Z  echo "docker-image=${DOCKER_REGISTRY}/${REPO_NAME}/${DOCKER_IMAGE_NAME}:${DOCKER_TAG}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7747530Z  echo "custom-tag-prefix=${CUSTOM_TAG_PREFIX}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7747782Z fi 2025-09-07T07:39:45.7752675Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:45.7752941Z env: 2025-09-07T07:39:45.7753104Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.7753294Z REPO_NAME: pytorch 2025-09-07T07:39:45.7753961Z DOCKER_IMAGE_NAME: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7754523Z DOCKER_BUILD_DIR: .ci/docker 2025-09-07T07:39:45.7754721Z DOCKER_BUILD_SCRIPT: ./build.sh 2025-09-07T07:39:45.7754971Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7755236Z USE_CUSTOM_DOCKER_REGISTRY: true 2025-09-07T07:39:45.7755423Z CUSTOM_TAG_PREFIX: 2025-09-07T07:39:45.7755594Z ##[endgroup] 2025-09-07T07:39:45.7773592Z + [[ -d .ci/docker ]] 2025-09-07T07:39:45.7773835Z + [[ -f .ci/docker/./build.sh ]] 2025-09-07T07:39:45.7774044Z + [[ true == \t\r\u\e ]] 2025-09-07T07:39:45.7774325Z + echo skip=false 2025-09-07T07:39:45.7775049Z + [[ 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 == *\3\0\8\5\3\5\3\8\5\1\1\4\.\d\k\r\.\e\c\r\.\u\s\-\e\a\s\t\-\1\.\a\m\a\z\o\n\a\w\s\.\c\o\m\/\p\y\t\o\r\c\h* ]] 2025-09-07T07:39:45.7780702Z ++ echo 
308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7781454Z ++ awk -F '[:,]' '{print $2}' 2025-09-07T07:39:45.7797506Z + DOCKER_TAG=pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7798152Z + echo docker-tag=pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7798904Z + echo docker-image=308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7842165Z ##[group]Run set +e 2025-09-07T07:39:45.7842384Z set +e 2025-09-07T07:39:45.7842542Z set -x 2025-09-07T07:39:45.7842693Z  2025-09-07T07:39:45.7842834Z login() { 2025-09-07T07:39:45.7843140Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-09-07T07:39:45.7843462Z } 2025-09-07T07:39:45.7843608Z  2025-09-07T07:39:45.7843740Z retry () { 2025-09-07T07:39:45.7843926Z  $* || (sleep 1 && $*) || (sleep 2 && $*) 2025-09-07T07:39:45.7844124Z } 2025-09-07T07:39:45.7844266Z  2025-09-07T07:39:45.7844414Z retry login "${DOCKER_REGISTRY}" 2025-09-07T07:39:45.7844704Z  2025-09-07T07:39:45.7844856Z START_TIME=$(date +%s) 2025-09-07T07:39:45.7845054Z # Wait up to 120 minutes 2025-09-07T07:39:45.7845291Z while [[ $(( $(date +%s) - 7200 )) -lt $START_TIME ]]; do 2025-09-07T07:39:45.7845584Z  # Check if image already exists, if it does then skip building it 2025-09-07T07:39:45.7845884Z  if docker manifest inspect "${DOCKER_IMAGE}"; then 2025-09-07T07:39:45.7846116Z  exit 0 2025-09-07T07:39:45.7846274Z  fi 2025-09-07T07:39:45.7846414Z  2025-09-07T07:39:45.7846660Z  # NB: This flag is used by Docker build workflow to push the image to ECR, so we can 2025-09-07T07:39:45.7847040Z  # use this to differentiate between the Docker build and regular build jobs. For the 2025-09-07T07:39:45.7847420Z  # latter, it will wait for the Docker images to become available before continuing 2025-09-07T07:39:45.7847733Z  if [ "${DOCKER_PUSH:-false}" == "true" ]; then 2025-09-07T07:39:45.7847982Z  # It's a Docker build job, let's build the image 2025-09-07T07:39:45.7848193Z  break 2025-09-07T07:39:45.7848350Z  else 2025-09-07T07:39:45.7848571Z  # It's a regular build job, wait for the image to become available 2025-09-07T07:39:45.7848814Z  sleep 300 2025-09-07T07:39:45.7848980Z  fi 2025-09-07T07:39:45.7849131Z done 2025-09-07T07:39:45.7849271Z  2025-09-07T07:39:45.7849485Z # NB: This part requires a full checkout. Otherwise, the merge base will 2025-09-07T07:39:45.7849909Z # be empty. 
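[Editor's note: the `set +e` script above decides whether the CI Docker image needs to be rebuilt. The expected tag is simply the git tree hash of the Docker build directory (computed earlier by calculate-docker-image), and a regular test job polls ECR with `docker manifest inspect` for up to 120 minutes while the corresponding Docker build job (DOCKER_PUSH=true) builds and pushes the image; the merge-base fallback continues below. A rough sketch of the polling logic, assuming DOCKER_IMAGE and DOCKER_BUILD_DIR are set as in the step's env block:

  DOCKER_TAG=$(git rev-parse "HEAD:${DOCKER_BUILD_DIR}")   # tree hash of .ci/docker
  echo "expected tag: ${DOCKER_TAG}"
  deadline=$(( $(date +%s) + 7200 ))                        # wait up to 120 minutes
  until docker manifest inspect "${DOCKER_IMAGE}" >/dev/null 2>&1; do
    if [ "$(date +%s)" -ge "${deadline}" ]; then
      echo "Image ${DOCKER_IMAGE} never appeared in ECR" >&2
      exit 1
    fi
    sleep 300   # a test job just waits for the build job to push the image
  done

In this run the manifest inspect succeeds on the first try, so the step exits immediately. The raw log resumes below.]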
The default action would be to continue rebuild the image 2025-09-07T07:39:45.7850217Z if [[ "$BASE_REVISION" = "$(git rev-parse HEAD)" ]]; then 2025-09-07T07:39:45.7850490Z  # if we're on the base branch then use the parent commit 2025-09-07T07:39:45.7850738Z  MERGE_BASE=$(git rev-parse HEAD~) 2025-09-07T07:39:45.7850930Z else 2025-09-07T07:39:45.7851143Z  # otherwise we're on a PR, so use the most recent base commit 2025-09-07T07:39:45.7851434Z  MERGE_BASE=$(git merge-base HEAD "$BASE_REVISION") 2025-09-07T07:39:45.7851650Z fi 2025-09-07T07:39:45.7851784Z  2025-09-07T07:39:45.7851942Z if [[ -z "${MERGE_BASE}" ]]; then 2025-09-07T07:39:45.7852172Z  echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7852378Z  2025-09-07T07:39:45.7852649Z  echo "Finding merge base only works with full checkout, please set fetch-depth to 0, continuing ..." 2025-09-07T07:39:45.7852963Z  exit 0 2025-09-07T07:39:45.7853114Z fi 2025-09-07T07:39:45.7853256Z  2025-09-07T07:39:45.7853454Z if ! git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}"; then 2025-09-07T07:39:45.7853835Z  echo "Directory '${DOCKER_BUILD_DIR}' not found in commit $MERGE_BASE, you should rebase onto a more recent commit" 2025-09-07T07:39:45.7854171Z  exit 1 2025-09-07T07:39:45.7854323Z fi 2025-09-07T07:39:45.7854463Z  2025-09-07T07:39:45.7854684Z PREVIOUS_DOCKER_TAG=$(git rev-parse "${MERGE_BASE}:${DOCKER_BUILD_DIR}") 2025-09-07T07:39:45.7855069Z # If no image exists but the hash is the same as the previous hash then we should error out here 2025-09-07T07:39:45.7855417Z if [[ "${PREVIOUS_DOCKER_TAG}" == "${DOCKER_TAG}" ]]; then 2025-09-07T07:39:45.7855812Z  echo "WARNING: Something has gone wrong and the previous image isn't available for the merge-base of your branch" 2025-09-07T07:39:45.7856251Z  echo " Will re-build docker image to store in local cache, TTS may be longer" 2025-09-07T07:39:45.7856511Z fi 2025-09-07T07:39:45.7856650Z  2025-09-07T07:39:45.7856826Z echo "rebuild=true" >> "${GITHUB_OUTPUT}" 2025-09-07T07:39:45.7860576Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:45.7860874Z env: 2025-09-07T07:39:45.7861025Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:45.7861212Z DOCKER_BUILD_DIR: .ci/docker 2025-09-07T07:39:45.7861443Z BASE_REVISION: 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:39:45.7862010Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7862720Z DOCKER_TAG: pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:45.7863167Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7863412Z DOCKER_PUSH: 2025-09-07T07:39:45.7863571Z ##[endgroup] 2025-09-07T07:39:45.7883481Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7883967Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7886748Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:45.7887285Z + aws ecr get-login-password --region us-east-1 2025-09-07T07:39:46.1659811Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-09-07T07:39:46.1660339Z Configure a credential helper to remove this warning. 
See 2025-09-07T07:39:46.1660587Z Login Succeeded 2025-09-07T07:39:46.1660888Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-09-07T07:39:46.1661108Z 2025-09-07T07:39:46.1674211Z ++ date +%s 2025-09-07T07:39:46.1681478Z + START_TIME=1757230786 2025-09-07T07:39:46.1683959Z ++ date +%s 2025-09-07T07:39:46.1690858Z + [[ 1757223586 -lt 1757230786 ]] 2025-09-07T07:39:46.1691449Z + docker manifest inspect 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:46.3606524Z { 2025-09-07T07:39:46.3606750Z "schemaVersion": 2, 2025-09-07T07:39:46.3607063Z "mediaType": "application/vnd.docker.distribution.manifest.v2+json", 2025-09-07T07:39:46.3607341Z "config": { 2025-09-07T07:39:46.3607562Z "mediaType": "application/vnd.docker.container.image.v1+json", 2025-09-07T07:39:46.3607808Z "size": 30269, 2025-09-07T07:39:46.3608071Z "digest": "sha256:662d8c9dfc7db2f5d004293de4f2b7647941dee4c916479ef082d17fcdfd9c47" 2025-09-07T07:39:46.3608346Z }, 2025-09-07T07:39:46.3608480Z "layers": [ 2025-09-07T07:39:46.3608621Z { 2025-09-07T07:39:46.3608822Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3609069Z "size": 30448359, 2025-09-07T07:39:46.3609342Z "digest": "sha256:e6fdc8487bfe6d764301ef3634bc6c043841dc3ab05ca14f81e69c0f92562d46" 2025-09-07T07:39:46.3609631Z }, 2025-09-07T07:39:46.3609761Z { 2025-09-07T07:39:46.3609971Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3610225Z "size": 1554, 2025-09-07T07:39:46.3610484Z "digest": "sha256:18a5ee5b0e2e283bf6d7b9c4c312b0448c75eff1c43446c22c5139a3aeec97fe" 2025-09-07T07:39:46.3610759Z }, 2025-09-07T07:39:46.3610872Z { 2025-09-07T07:39:46.3611065Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3611308Z "size": 313297813, 2025-09-07T07:39:46.3611565Z "digest": "sha256:572424b92528ee46c84fdf3e9e1f5fd75e302621ad75dcf4257ad06778885094" 2025-09-07T07:39:46.3611827Z }, 2025-09-07T07:39:46.3611947Z { 2025-09-07T07:39:46.3612143Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3612377Z "size": 793, 2025-09-07T07:39:46.3612622Z "digest": "sha256:1c35b7d4b67c6769f59f96a643d69c214c5b00291a4968cdd395eedbce82b9c0" 2025-09-07T07:39:46.3612886Z }, 2025-09-07T07:39:46.3613010Z { 2025-09-07T07:39:46.3613209Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3613443Z "size": 106, 2025-09-07T07:39:46.3613701Z "digest": "sha256:68c20f3c23bb0bddb9b69e6ce2e45bcd5b1fcfd9b37dbe3de26b8a5f0e81ff13" 2025-09-07T07:39:46.3614185Z }, 2025-09-07T07:39:46.3614309Z { 2025-09-07T07:39:46.3614500Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3614746Z "size": 704, 2025-09-07T07:39:46.3614993Z "digest": "sha256:7efa39950d3273a15b20bc5f6659373b2b4eb62e36328d96b289834c48d2e408" 2025-09-07T07:39:46.3615260Z }, 2025-09-07T07:39:46.3615380Z { 2025-09-07T07:39:46.3638346Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3638662Z "size": 1214, 2025-09-07T07:39:46.3638947Z "digest": "sha256:a10eb16a7271e996ea9f1d769ba6bd2ec69358f2a79cf26649595a8cea38275f" 2025-09-07T07:39:46.3639345Z + exit 0 2025-09-07T07:39:46.3639501Z }, 2025-09-07T07:39:46.3639628Z { 2025-09-07T07:39:46.3639855Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3640106Z "size": 485, 
2025-09-07T07:39:46.3640373Z "digest": "sha256:7d52cf57965449440c17f257fe4c522f9685019961eaa9853d7c820cfe39f5cc" 2025-09-07T07:39:46.3640644Z }, 2025-09-07T07:39:46.3640775Z { 2025-09-07T07:39:46.3640984Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3641233Z "size": 110343705, 2025-09-07T07:39:46.3641495Z "digest": "sha256:cb6a20fcf4e24ec2e1f72ecf361b26e058f3e6194947a9b3a25312223d43516e" 2025-09-07T07:39:46.3641776Z }, 2025-09-07T07:39:46.3641907Z { 2025-09-07T07:39:46.3642120Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3642361Z "size": 4787, 2025-09-07T07:39:46.3642616Z "digest": "sha256:46fb6a8b3e1d4eac9b3a21577824410003ed38f194b4b1486b747e324b32ef6a" 2025-09-07T07:39:46.3643005Z }, 2025-09-07T07:39:46.3643136Z { 2025-09-07T07:39:46.3643333Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3643578Z "size": 1709, 2025-09-07T07:39:46.3643839Z "digest": "sha256:5ad6977cc38e4ea8a6545d6a4fc0e2fdde705a7af96eb496cfe20f264fbc1e74" 2025-09-07T07:39:46.3644121Z }, 2025-09-07T07:39:46.3644243Z { 2025-09-07T07:39:46.3644443Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3644688Z "size": 724, 2025-09-07T07:39:46.3644933Z "digest": "sha256:da63046995a2e510b7146776371a14bff4b31002cc3ef0322e45a3932fba2031" 2025-09-07T07:39:46.3645192Z }, 2025-09-07T07:39:46.3645321Z { 2025-09-07T07:39:46.3645517Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3645766Z "size": 543, 2025-09-07T07:39:46.3646014Z "digest": "sha256:78243fdb9906cb588921ddaa67a3ca915aa9447ca675faac1a9ebc420a561d83" 2025-09-07T07:39:46.3646289Z }, 2025-09-07T07:39:46.3646419Z { 2025-09-07T07:39:46.3646617Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3646854Z "size": 3395447162, 2025-09-07T07:39:46.3647118Z "digest": "sha256:6f70d5d50abaab8988f460b5590d92b6d1d340575ddee981662c24034d7d20af" 2025-09-07T07:39:46.3647392Z }, 2025-09-07T07:39:46.3647522Z { 2025-09-07T07:39:46.3647716Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3647959Z "size": 32, 2025-09-07T07:39:46.3648214Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3648489Z }, 2025-09-07T07:39:46.3648604Z { 2025-09-07T07:39:46.3648802Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3649044Z "size": 380, 2025-09-07T07:39:46.3649293Z "digest": "sha256:69715d3ad3c493436abde51f5a575e79f7d55b46c653f5607f3c7722ad9a05db" 2025-09-07T07:39:46.3649566Z }, 2025-09-07T07:39:46.3649683Z { 2025-09-07T07:39:46.3649887Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3650133Z "size": 235844, 2025-09-07T07:39:46.3650377Z "digest": "sha256:7ace90c063f3f3ce8f04b541afe935088868930e5c074824af2b2c327779a3b5" 2025-09-07T07:39:46.3650651Z }, 2025-09-07T07:39:46.3650779Z { 2025-09-07T07:39:46.3650982Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3652040Z "size": 230, 2025-09-07T07:39:46.3652292Z "digest": "sha256:acbd5447dd1406dab8e46234f6a034a75ad9794f76c24f817b0ecf28b6a69c78" 2025-09-07T07:39:46.3652570Z }, 2025-09-07T07:39:46.3652703Z { 2025-09-07T07:39:46.3652896Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3653142Z "size": 3396092, 2025-09-07T07:39:46.3653396Z "digest": 
"sha256:744523d9b7f5a3e7abfc646c2d5222e7379024242430b93cb4b8093574e69022" 2025-09-07T07:39:46.3653665Z }, 2025-09-07T07:39:46.3653794Z { 2025-09-07T07:39:46.3653986Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3654224Z "size": 1477, 2025-09-07T07:39:46.3654470Z "digest": "sha256:5bd615a7b945084e11bcb40190f9d6e50367297237146df7b008fa8c668f29c8" 2025-09-07T07:39:46.3654730Z }, 2025-09-07T07:39:46.3654848Z { 2025-09-07T07:39:46.3655046Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3655288Z "size": 482, 2025-09-07T07:39:46.3655540Z "digest": "sha256:f4986a00e3aecf1d56beaada7aba8c49fbb3683db3c99790ab0aa4caaa34f76f" 2025-09-07T07:39:46.3655807Z }, 2025-09-07T07:39:46.3655933Z { 2025-09-07T07:39:46.3656131Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3656372Z "size": 196, 2025-09-07T07:39:46.3656610Z "digest": "sha256:21902f6e4f8cb76c82e755b8fc9f72e1912bf925ab345ab5b4cc2210f4887a64" 2025-09-07T07:39:46.3656879Z }, 2025-09-07T07:39:46.3657005Z { 2025-09-07T07:39:46.3657204Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3657475Z "size": 608, 2025-09-07T07:39:46.3657729Z "digest": "sha256:d80602abf3ccf0c0b527848a403dfde36e1cf1db1416852385feda5c44bf4363" 2025-09-07T07:39:46.3658004Z }, 2025-09-07T07:39:46.3658129Z { 2025-09-07T07:39:46.3658319Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3658559Z "size": 226, 2025-09-07T07:39:46.3658813Z "digest": "sha256:3c51bf0bc362d34a17911f73c5146cbd668c4d1cf1b944cbf40a604d71cd623a" 2025-09-07T07:39:46.3659088Z }, 2025-09-07T07:39:46.3659204Z { 2025-09-07T07:39:46.3659405Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3659652Z "size": 828, 2025-09-07T07:39:46.3659902Z "digest": "sha256:119ab3bceafa6f2cab4b1f71161195139792990263ee8de82230c6284f0ae20a" 2025-09-07T07:39:46.3660169Z }, 2025-09-07T07:39:46.3660295Z { 2025-09-07T07:39:46.3660490Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3660731Z "size": 32, 2025-09-07T07:39:46.3660979Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3661260Z }, 2025-09-07T07:39:46.3661388Z { 2025-09-07T07:39:46.3661588Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3661822Z "size": 104, 2025-09-07T07:39:46.3662073Z "digest": "sha256:af8eadc9eaabdaf6c5e01031d63061605327153e07568ddd159966ecea75cd07" 2025-09-07T07:39:46.3662352Z }, 2025-09-07T07:39:46.3662480Z { 2025-09-07T07:39:46.3662673Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3662920Z "size": 1495, 2025-09-07T07:39:46.3663171Z "digest": "sha256:e7769b0d7a8262f3cc32a9d96080de5318dac3d2617e10508a167e689016e40c" 2025-09-07T07:39:46.3663442Z }, 2025-09-07T07:39:46.3663562Z { 2025-09-07T07:39:46.3663764Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3664008Z "size": 453908015, 2025-09-07T07:39:46.3664271Z "digest": "sha256:ba263639b0f4634277ef3b8903e3457ac27ce012f1bbeeeeb773191c2c3b222b" 2025-09-07T07:39:46.3664538Z }, 2025-09-07T07:39:46.3664662Z { 2025-09-07T07:39:46.3664858Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3665102Z "size": 164, 2025-09-07T07:39:46.3665341Z "digest": "sha256:a5ab7a280382a797dd5ba6a6716f667a231540ad1e0e7c8ba48bb24d5ab80ef0" 2025-09-07T07:39:46.3665655Z 
}, 2025-09-07T07:39:46.3665783Z { 2025-09-07T07:39:46.3665982Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3666215Z "size": 346, 2025-09-07T07:39:46.3666463Z "digest": "sha256:80b2232d952f55c3662cffd657ba30fe825f08dfcc5bbea13e2bc6de4482b7e4" 2025-09-07T07:39:46.3666739Z }, 2025-09-07T07:39:46.3666863Z { 2025-09-07T07:39:46.3667051Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3667292Z "size": 32, 2025-09-07T07:39:46.3667547Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3667819Z }, 2025-09-07T07:39:46.3667937Z { 2025-09-07T07:39:46.3668139Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3668380Z "size": 106, 2025-09-07T07:39:46.3668626Z "digest": "sha256:cc93cd65e90f0a9c50194579c93e96897f4e582b9777a1c4d7df7b913ddcdded" 2025-09-07T07:39:46.3668894Z }, 2025-09-07T07:39:46.3669023Z { 2025-09-07T07:39:46.3669222Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3669461Z "size": 425, 2025-09-07T07:39:46.3669702Z "digest": "sha256:0eed4c15712bc470dac7df87e33b3570a1510344019dd9cc0e95b8beb1f98372" 2025-09-07T07:39:46.3669975Z }, 2025-09-07T07:39:46.3670098Z { 2025-09-07T07:39:46.3670299Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3670534Z "size": 19309387, 2025-09-07T07:39:46.3670785Z "digest": "sha256:092516f71fe325518f9737f105bcd65c40cd35c3019098889757e2c84c03c8a8" 2025-09-07T07:39:46.3671085Z }, 2025-09-07T07:39:46.3671213Z { 2025-09-07T07:39:46.3671405Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3671646Z "size": 108, 2025-09-07T07:39:46.3671888Z "digest": "sha256:8c0825014a6270f765ff514da8583d55874f3278bef76e5617e29115f91ee654" 2025-09-07T07:39:46.3672157Z }, 2025-09-07T07:39:46.3672280Z { 2025-09-07T07:39:46.3672479Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3672724Z "size": 636, 2025-09-07T07:39:46.3672976Z "digest": "sha256:8e0d2f63da0a8ff07657d7e06cdbc1ad9d5db95614d640a9f7a9aa8c30c9986d" 2025-09-07T07:39:46.3673246Z }, 2025-09-07T07:39:46.3673375Z { 2025-09-07T07:39:46.3673579Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3673823Z "size": 724, 2025-09-07T07:39:46.3674056Z "digest": "sha256:da63046995a2e510b7146776371a14bff4b31002cc3ef0322e45a3932fba2031" 2025-09-07T07:39:46.3674324Z }, 2025-09-07T07:39:46.3674447Z { 2025-09-07T07:39:46.3674649Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3674883Z "size": 148, 2025-09-07T07:39:46.3675129Z "digest": "sha256:73aae7958ba1a16c5f5625d39b06208e1def8c7816bb75028bf0845f553a5068" 2025-09-07T07:39:46.3675396Z }, 2025-09-07T07:39:46.3675520Z { 2025-09-07T07:39:46.3675708Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3675952Z "size": 136, 2025-09-07T07:39:46.3676196Z "digest": "sha256:ac6077ec9fa50fc0822d387d2ee35e1b6f1f56612402fe7195378180b25087bc" 2025-09-07T07:39:46.3676467Z }, 2025-09-07T07:39:46.3676585Z { 2025-09-07T07:39:46.3676783Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3677026Z "size": 140, 2025-09-07T07:39:46.3677273Z "digest": "sha256:bf4ee4e45e92ef179f7fc64e2c7c6755905a969c37cf82c39aafbadd9290ff04" 2025-09-07T07:39:46.3677544Z }, 2025-09-07T07:39:46.3677670Z { 2025-09-07T07:39:46.3677874Z "mediaType": 
"application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3678123Z "size": 18617175577, 2025-09-07T07:39:46.3678382Z "digest": "sha256:c1b766f9b961bcc863d6f89d623815fd7dfe9797ddcfd5d15ef06ffe7d177359" 2025-09-07T07:39:46.3678661Z }, 2025-09-07T07:39:46.3678787Z { 2025-09-07T07:39:46.3678990Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3679285Z "size": 223, 2025-09-07T07:39:46.3679529Z "digest": "sha256:6e726ef07b5d5cfe2fb9f06d43fc931fc64c381fd37eaf0c169e0dd84796f152" 2025-09-07T07:39:46.3679805Z }, 2025-09-07T07:39:46.3679927Z { 2025-09-07T07:39:46.3680125Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3680360Z "size": 274477524, 2025-09-07T07:39:46.3680614Z "digest": "sha256:364070434a64fa913f3907ada910a4051707e693e0e6124f57bc97aa57791da1" 2025-09-07T07:39:46.3680883Z }, 2025-09-07T07:39:46.3681106Z { 2025-09-07T07:39:46.3681309Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3681569Z "size": 6451569004, 2025-09-07T07:39:46.3681839Z "digest": "sha256:71f708151a84685fc366b85e914dac9f5279313eff07358d79ecaaeecb0f1c42" 2025-09-07T07:39:46.3682119Z }, 2025-09-07T07:39:46.3682242Z { 2025-09-07T07:39:46.3682448Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3682705Z "size": 129, 2025-09-07T07:39:46.3682963Z "digest": "sha256:622d8cfb39ea4dda608d2819c6a9de45df81b6f8319ee8ab4a24c36d81b9a132" 2025-09-07T07:39:46.3683241Z }, 2025-09-07T07:39:46.3683375Z { 2025-09-07T07:39:46.3683581Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3683829Z "size": 778, 2025-09-07T07:39:46.3684070Z "digest": "sha256:284119a92cb13dacff06926444aab4f99756039acb48abba7b75d35c367ed3f1" 2025-09-07T07:39:46.3684351Z }, 2025-09-07T07:39:46.3684475Z { 2025-09-07T07:39:46.3684677Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3684968Z "size": 724, 2025-09-07T07:39:46.3685219Z "digest": "sha256:da63046995a2e510b7146776371a14bff4b31002cc3ef0322e45a3932fba2031" 2025-09-07T07:39:46.3685488Z }, 2025-09-07T07:39:46.3685615Z { 2025-09-07T07:39:46.3685806Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3686053Z "size": 140, 2025-09-07T07:39:46.3686301Z "digest": "sha256:96695940d842555623cfe4fb7b52e949423e8c8f383e55d02363e7e5c5804afa" 2025-09-07T07:39:46.3686570Z }, 2025-09-07T07:39:46.3686691Z { 2025-09-07T07:39:46.3686894Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3687147Z "size": 32, 2025-09-07T07:39:46.3687407Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3687677Z }, 2025-09-07T07:39:46.3687805Z { 2025-09-07T07:39:46.3688005Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3688249Z "size": 160, 2025-09-07T07:39:46.3688496Z "digest": "sha256:7ddca6c4c050460204097ba875dc0fa03eca6265122a18c0b8dc5504152aea53" 2025-09-07T07:39:46.3688774Z }, 2025-09-07T07:39:46.3688905Z { 2025-09-07T07:39:46.3689108Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3689346Z "size": 1012, 2025-09-07T07:39:46.3689616Z "digest": "sha256:a95e1f2f1aadef03514a7cdbdac1fe83d4eebedbb80df9be868a223f27e1c263" 2025-09-07T07:39:46.3689910Z }, 2025-09-07T07:39:46.3690039Z { 2025-09-07T07:39:46.3690232Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3690480Z "size": 
724, 2025-09-07T07:39:46.3690729Z "digest": "sha256:da63046995a2e510b7146776371a14bff4b31002cc3ef0322e45a3932fba2031" 2025-09-07T07:39:46.3690994Z }, 2025-09-07T07:39:46.3691115Z { 2025-09-07T07:39:46.3691318Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3691563Z "size": 135, 2025-09-07T07:39:46.3691811Z "digest": "sha256:8085756b0cc0f9588f23a73c27840a5dff48cc18c3a2f0311e4d1ef291855679" 2025-09-07T07:39:46.3692078Z }, 2025-09-07T07:39:46.3692204Z { 2025-09-07T07:39:46.3692405Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3692654Z "size": 32, 2025-09-07T07:39:46.3692901Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3693241Z }, 2025-09-07T07:39:46.3693376Z { 2025-09-07T07:39:46.3693584Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3693830Z "size": 158, 2025-09-07T07:39:46.3694084Z "digest": "sha256:7e9ff0c6f103b18756f01c60b4d57a951660f17bffb1810b330e3ff703caf216" 2025-09-07T07:39:46.3694364Z }, 2025-09-07T07:39:46.3694499Z { 2025-09-07T07:39:46.3694700Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3694947Z "size": 1369, 2025-09-07T07:39:46.3695212Z "digest": "sha256:a625cbbc05b983aeb4c28702a4a5b65c68191ab1b8d17978f7d98cc17ddf3c52" 2025-09-07T07:39:46.3695502Z }, 2025-09-07T07:39:46.3695630Z { 2025-09-07T07:39:46.3695838Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3696094Z "size": 32, 2025-09-07T07:39:46.3696351Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3696626Z }, 2025-09-07T07:39:46.3696762Z { 2025-09-07T07:39:46.3696969Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3697222Z "size": 136, 2025-09-07T07:39:46.3697465Z "digest": "sha256:4e28486424310870c8d6815524440f17c6e0afe7572eaa173a811b98b4920bed" 2025-09-07T07:39:46.3697742Z }, 2025-09-07T07:39:46.3697876Z { 2025-09-07T07:39:46.3698084Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3698327Z "size": 380, 2025-09-07T07:39:46.3698587Z "digest": "sha256:5e944f1ed1bef9442f5b1b86225d3958ea8f2f7f4c6aa7b92dc5d0c810c260bc" 2025-09-07T07:39:46.3698875Z }, 2025-09-07T07:39:46.3699037Z { 2025-09-07T07:39:46.3699236Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3699487Z "size": 32, 2025-09-07T07:39:46.3699742Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3700024Z }, 2025-09-07T07:39:46.3700144Z { 2025-09-07T07:39:46.3700354Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3700604Z "size": 104, 2025-09-07T07:39:46.3700857Z "digest": "sha256:41619248f604c60e038a02bfd462af96ee2996b77be5f59f05e9ac5fe4790e5a" 2025-09-07T07:39:46.3701128Z }, 2025-09-07T07:39:46.3701255Z { 2025-09-07T07:39:46.3701458Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3701706Z "size": 407, 2025-09-07T07:39:46.3701957Z "digest": "sha256:be86f8c4f654b9ae64a20eb7f960e6ce4baa5b46e0a1f5e1312b11492a40bcd4" 2025-09-07T07:39:46.3702244Z }, 2025-09-07T07:39:46.3702371Z { 2025-09-07T07:39:46.3702577Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3702817Z "size": 32, 2025-09-07T07:39:46.3703072Z "digest": 
"sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3703352Z }, 2025-09-07T07:39:46.3703480Z { 2025-09-07T07:39:46.3703674Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3703926Z "size": 109, 2025-09-07T07:39:46.3704183Z "digest": "sha256:ef1340e22a4bc8cf42e1d40961cb32d183cd3da8f0b785b5425c32ee067690c1" 2025-09-07T07:39:46.3704464Z }, 2025-09-07T07:39:46.3704583Z { 2025-09-07T07:39:46.3704787Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3705038Z "size": 1897, 2025-09-07T07:39:46.3705298Z "digest": "sha256:da8d8b696333cbf6b9f339ab859639c905d6752d7e65fea14c23c3c2dcba553e" 2025-09-07T07:39:46.3705570Z }, 2025-09-07T07:39:46.3705695Z { 2025-09-07T07:39:46.3705905Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3706159Z "size": 243443118, 2025-09-07T07:39:46.3706416Z "digest": "sha256:386b0c49c4982a821fb6f427fbc7d9c7d2012e97c96a514a9c7a09304e76b935" 2025-09-07T07:39:46.3706695Z }, 2025-09-07T07:39:46.3706822Z { 2025-09-07T07:39:46.3707025Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3707300Z "size": 106, 2025-09-07T07:39:46.3707565Z "digest": "sha256:2b1d0ea7efe0bf86e86df804d2cddbf83b113fdecd03f3ddfca728da30546f34" 2025-09-07T07:39:46.3707851Z }, 2025-09-07T07:39:46.3707980Z { 2025-09-07T07:39:46.3708175Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3708420Z "size": 163, 2025-09-07T07:39:46.3708677Z "digest": "sha256:04c04be7408f20625b1bd8454e5a08c91fcf04d4f79ab3ec1b75ae6b1824174d" 2025-09-07T07:39:46.3708956Z }, 2025-09-07T07:39:46.3709076Z { 2025-09-07T07:39:46.3709284Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3709531Z "size": 7943, 2025-09-07T07:39:46.3709794Z "digest": "sha256:f8690caa3ac5e845f2dcc25ad12815b5c7452285c3838a87c780bd03ecf072a3" 2025-09-07T07:39:46.3710072Z }, 2025-09-07T07:39:46.3710203Z { 2025-09-07T07:39:46.3710412Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3710655Z "size": 8074, 2025-09-07T07:39:46.3710904Z "digest": "sha256:2908d6baaa6b21331dee5f210472cae0874d22b98b0a35420cad4fd753ed215f" 2025-09-07T07:39:46.3711179Z }, 2025-09-07T07:39:46.3711314Z { 2025-09-07T07:39:46.3711519Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3711755Z "size": 303, 2025-09-07T07:39:46.3712000Z "digest": "sha256:37e2336101eba2c73995d34431e4fae8782d9e9700c42621777922490b2158ed" 2025-09-07T07:39:46.3712265Z }, 2025-09-07T07:39:46.3712395Z { 2025-09-07T07:39:46.3712589Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3712831Z "size": 32, 2025-09-07T07:39:46.3713115Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3713394Z }, 2025-09-07T07:39:46.3713512Z { 2025-09-07T07:39:46.3713704Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3713942Z "size": 108, 2025-09-07T07:39:46.3714174Z "digest": "sha256:f1ac881fde33994861be4324231269058643168b9aee60c699552d0d92d965da" 2025-09-07T07:39:46.3714434Z }, 2025-09-07T07:39:46.3714552Z { 2025-09-07T07:39:46.3714744Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3714984Z "size": 54145699, 2025-09-07T07:39:46.3715228Z "digest": "sha256:43b14c67347e2813c5f63e928c14db60dbb35c330ccc865510cf79739d8b78a1" 
2025-09-07T07:39:46.3715494Z }, 2025-09-07T07:39:46.3715612Z { 2025-09-07T07:39:46.3715809Z "mediaType": "application/vnd.docker.image.rootfs.diff.tar.gzip", 2025-09-07T07:39:46.3716041Z "size": 32, 2025-09-07T07:39:46.3716291Z "digest": "sha256:4f4fb700ef54461cfa02571ae0db9a0dc1e0cdb5577484a6d75e68dc38e8acc1" 2025-09-07T07:39:46.3716561Z } 2025-09-07T07:39:46.3716686Z ] 2025-09-07T07:39:46.3716804Z } 2025-09-07T07:39:46.3734563Z ##[group]Run set -eux 2025-09-07T07:39:46.3734755Z set -eux 2025-09-07T07:39:46.3735021Z # It's ok if this steps fails, it would then be an anonymous user like what we used to have 2025-09-07T07:39:46.3735682Z aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token | jq --raw-output '.SecretString' | jq -r .docker_hub_readonly_token | docker login --username pytorchbot --password-stdin || true 2025-09-07T07:39:46.3740883Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:46.3741114Z env: 2025-09-07T07:39:46.3741272Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:46.3741456Z ##[endgroup] 2025-09-07T07:39:46.3761942Z + aws secretsmanager get-secret-value --secret-id docker_hub_readonly_token 2025-09-07T07:39:46.3762507Z + jq --raw-output .SecretString 2025-09-07T07:39:46.3763283Z + jq -r .docker_hub_readonly_token 2025-09-07T07:39:46.3764023Z + docker login --username pytorchbot --password-stdin 2025-09-07T07:39:46.7946830Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-09-07T07:39:46.7947148Z Login Succeeded 2025-09-07T07:39:46.7947373Z Configure a credential helper to remove this warning. See 2025-09-07T07:39:46.7947947Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-09-07T07:39:46.7948177Z 2025-09-07T07:39:46.8003399Z ##[group]Run tag=${ECR_DOCKER_IMAGE##*:} 2025-09-07T07:39:46.8003645Z tag=${ECR_DOCKER_IMAGE##*:} 2025-09-07T07:39:46.8003895Z echo "docker pull ghcr.io/pytorch/ci-image:${tag/:/-}" 2025-09-07T07:39:46.8008077Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:46.8008311Z env: 2025-09-07T07:39:46.8008461Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:46.8008985Z ECR_DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:46.8009494Z ##[endgroup] 2025-09-07T07:39:46.8028910Z docker pull ghcr.io/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:46.8056988Z ##[group]Run pytorch/test-infra/.github/actions/pull-docker-image@main 2025-09-07T07:39:46.8057288Z with: 2025-09-07T07:39:46.8057811Z docker-image: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:46.8058418Z docker-registry: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:46.8058673Z env: 2025-09-07T07:39:46.8058836Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:46.8059027Z ##[endgroup] 2025-09-07T07:39:46.8127184Z ##[group]Run set -x 2025-09-07T07:39:46.8127366Z set -x 2025-09-07T07:39:46.8127512Z set +e 2025-09-07T07:39:46.8127657Z  2025-09-07T07:39:46.8127792Z login() { 2025-09-07T07:39:46.8128088Z  aws ecr get-login-password --region us-east-1 | docker login -u AWS --password-stdin "$1" 2025-09-07T07:39:46.8128390Z } 2025-09-07T07:39:46.8128533Z  2025-09-07T07:39:46.8128696Z retry () { 2025-09-07T07:39:46.8128879Z  $* || 
(sleep 1 && $*) || (sleep 2 && $*) 2025-09-07T07:39:46.8129066Z } 2025-09-07T07:39:46.8129200Z  2025-09-07T07:39:46.8129366Z retry login "${DOCKER_REGISTRY}" 2025-09-07T07:39:46.8129554Z  2025-09-07T07:39:46.8129833Z IMAGE_SIZE=$(docker manifest inspect "${DOCKER_IMAGE}" | jq '[.layers[].size, .config.size] | add / 1024 / 1024') 2025-09-07T07:39:46.8130214Z echo "Compressed size of image in MB: ${IMAGE_SIZE}" 2025-09-07T07:39:46.8130439Z  2025-09-07T07:39:46.8130572Z set -e 2025-09-07T07:39:46.8130781Z # ignore output since only exit code is used for conditional 2025-09-07T07:39:46.8131062Z # only pull docker image if it's not available locally 2025-09-07T07:39:46.8131378Z if ! docker inspect --type=image "${DOCKER_IMAGE}" >/dev/null 2>/dev/null; then 2025-09-07T07:39:46.8131670Z  retry docker pull "${DOCKER_IMAGE}" 2025-09-07T07:39:46.8131866Z fi 2025-09-07T07:39:46.8135742Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:39:46.8135962Z env: 2025-09-07T07:39:46.8136105Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:39:46.8136598Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:46.8137152Z DOCKER_REGISTRY: 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:46.8137374Z ##[endgroup] 2025-09-07T07:39:46.8153984Z + set +e 2025-09-07T07:39:46.8154269Z + retry login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:46.8154539Z + login 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:46.8158052Z + aws ecr get-login-password --region us-east-1 2025-09-07T07:39:46.8158556Z + docker login -u AWS --password-stdin 308535385114.dkr.ecr.us-east-1.amazonaws.com 2025-09-07T07:39:47.1885708Z WARNING! Your password will be stored unencrypted in /home/ec2-user/.docker/config.json. 2025-09-07T07:39:47.1886309Z Configure a credential helper to remove this warning. 
See 2025-09-07T07:39:47.1886650Z https://docs.docker.com/engine/reference/commandline/login/#credentials-store 2025-09-07T07:39:47.1886867Z 2025-09-07T07:39:47.1886945Z Login Succeeded 2025-09-07T07:39:47.1902438Z ++ docker manifest inspect 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:47.1903027Z ++ jq '[.layers[].size, .config.size] | add / 1024 / 1024' 2025-09-07T07:39:47.3997444Z + IMAGE_SIZE=28579.020259857178 2025-09-07T07:39:47.3997838Z + echo 'Compressed size of image in MB: 28579.020259857178' 2025-09-07T07:39:47.3998109Z Compressed size of image in MB: 28579.020259857178 2025-09-07T07:39:47.3998353Z + set -e 2025-09-07T07:39:47.3999139Z + docker inspect --type=image 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:47.4112543Z + retry docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:47.4113397Z + docker pull 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:39:47.7119606Z pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77: Pulling from pytorch/ci-image 2025-09-07T07:39:47.7120053Z e6fdc8487bfe: Pulling fs layer 2025-09-07T07:39:47.7120245Z 18a5ee5b0e2e: Pulling fs layer 2025-09-07T07:39:47.7120418Z 572424b92528: Pulling fs layer 2025-09-07T07:39:47.7120584Z 1c35b7d4b67c: Pulling fs layer 2025-09-07T07:39:47.7120747Z 68c20f3c23bb: Pulling fs layer 2025-09-07T07:39:47.7120915Z 7efa39950d32: Pulling fs layer 2025-09-07T07:39:47.7121096Z a10eb16a7271: Pulling fs layer 2025-09-07T07:39:47.7121273Z 7d52cf579654: Pulling fs layer 2025-09-07T07:39:47.7121437Z cb6a20fcf4e2: Pulling fs layer 2025-09-07T07:39:47.7121612Z 46fb6a8b3e1d: Pulling fs layer 2025-09-07T07:39:47.7121789Z 5ad6977cc38e: Pulling fs layer 2025-09-07T07:39:47.7121962Z da63046995a2: Pulling fs layer 2025-09-07T07:39:47.7122123Z 78243fdb9906: Pulling fs layer 2025-09-07T07:39:47.7122305Z 6f70d5d50aba: Pulling fs layer 2025-09-07T07:39:47.7122519Z 4f4fb700ef54: Pulling fs layer 2025-09-07T07:39:47.7122689Z 69715d3ad3c4: Pulling fs layer 2025-09-07T07:39:47.7122856Z 7ace90c063f3: Pulling fs layer 2025-09-07T07:39:47.7123019Z acbd5447dd14: Pulling fs layer 2025-09-07T07:39:47.7123189Z 744523d9b7f5: Pulling fs layer 2025-09-07T07:39:47.7123358Z 5bd615a7b945: Pulling fs layer 2025-09-07T07:39:47.7123526Z f4986a00e3ae: Pulling fs layer 2025-09-07T07:39:47.7123683Z 21902f6e4f8c: Pulling fs layer 2025-09-07T07:39:47.7123851Z d80602abf3cc: Pulling fs layer 2025-09-07T07:39:47.7124025Z 3c51bf0bc362: Pulling fs layer 2025-09-07T07:39:47.7124197Z 7d52cf579654: Waiting 2025-09-07T07:39:47.7124359Z 119ab3bceafa: Pulling fs layer 2025-09-07T07:39:47.7124538Z af8eadc9eaab: Pulling fs layer 2025-09-07T07:39:47.7124715Z e7769b0d7a82: Pulling fs layer 2025-09-07T07:39:47.7124889Z ba263639b0f4: Pulling fs layer 2025-09-07T07:39:47.7125046Z cb6a20fcf4e2: Waiting 2025-09-07T07:39:47.7125215Z a5ab7a280382: Pulling fs layer 2025-09-07T07:39:47.7125380Z 80b2232d952f: Pulling fs layer 2025-09-07T07:39:47.7125543Z 46fb6a8b3e1d: Waiting 2025-09-07T07:39:47.7125686Z 5ad6977cc38e: Waiting 2025-09-07T07:39:47.7125831Z 
da63046995a2: Waiting 2025-09-07T07:39:47.7125986Z cc93cd65e90f: Pulling fs layer 2025-09-07T07:39:47.7126156Z 0eed4c15712b: Pulling fs layer 2025-09-07T07:39:47.7126308Z 78243fdb9906: Waiting 2025-09-07T07:39:47.7126458Z 092516f71fe3: Pulling fs layer 2025-09-07T07:39:47.7126620Z 8c0825014a62: Pulling fs layer 2025-09-07T07:39:47.7126786Z 8e0d2f63da0a: Pulling fs layer 2025-09-07T07:39:47.7126941Z 6f70d5d50aba: Waiting 2025-09-07T07:39:47.7127275Z 73aae7958ba1: Pulling fs layer 2025-09-07T07:39:47.7127439Z 4f4fb700ef54: Waiting 2025-09-07T07:39:47.7127594Z ac6077ec9fa5: Pulling fs layer 2025-09-07T07:39:47.7127744Z 69715d3ad3c4: Waiting 2025-09-07T07:39:47.7127903Z bf4ee4e45e92: Pulling fs layer 2025-09-07T07:39:47.7128070Z c1b766f9b961: Pulling fs layer 2025-09-07T07:39:47.7128233Z 7ace90c063f3: Waiting 2025-09-07T07:39:47.7128381Z 6e726ef07b5d: Pulling fs layer 2025-09-07T07:39:47.7128543Z acbd5447dd14: Waiting 2025-09-07T07:39:47.7128693Z 364070434a64: Pulling fs layer 2025-09-07T07:39:47.7128850Z 744523d9b7f5: Waiting 2025-09-07T07:39:47.7128993Z 71f708151a84: Pulling fs layer 2025-09-07T07:39:47.7129155Z 622d8cfb39ea: Pulling fs layer 2025-09-07T07:39:47.7129317Z 5bd615a7b945: Waiting 2025-09-07T07:39:47.7129471Z 284119a92cb1: Pulling fs layer 2025-09-07T07:39:47.7129623Z f4986a00e3ae: Waiting 2025-09-07T07:39:47.7129776Z 96695940d842: Pulling fs layer 2025-09-07T07:39:47.7130020Z 21902f6e4f8c: Waiting 2025-09-07T07:39:47.7130179Z 7ddca6c4c050: Pulling fs layer 2025-09-07T07:39:47.7130339Z d80602abf3cc: Waiting 2025-09-07T07:39:47.7130494Z a95e1f2f1aad: Pulling fs layer 2025-09-07T07:39:47.7130658Z 3c51bf0bc362: Waiting 2025-09-07T07:39:47.7130803Z 8085756b0cc0: Pulling fs layer 2025-09-07T07:39:47.7130968Z 68c20f3c23bb: Waiting 2025-09-07T07:39:47.7131126Z 7e9ff0c6f103: Pulling fs layer 2025-09-07T07:39:47.7131300Z a625cbbc05b9: Pulling fs layer 2025-09-07T07:39:47.7131455Z 7efa39950d32: Waiting 2025-09-07T07:39:47.7131612Z 4e2848642431: Pulling fs layer 2025-09-07T07:39:47.7131782Z 5e944f1ed1be: Pulling fs layer 2025-09-07T07:39:47.7131954Z 41619248f604: Pulling fs layer 2025-09-07T07:39:47.7132107Z a10eb16a7271: Waiting 2025-09-07T07:39:47.7132263Z be86f8c4f654: Pulling fs layer 2025-09-07T07:39:47.7132431Z ef1340e22a4b: Pulling fs layer 2025-09-07T07:39:47.7132595Z a95e1f2f1aad: Waiting 2025-09-07T07:39:47.7132740Z da8d8b696333: Pulling fs layer 2025-09-07T07:39:47.7132901Z 1c35b7d4b67c: Waiting 2025-09-07T07:39:47.7133054Z 386b0c49c498: Pulling fs layer 2025-09-07T07:39:47.7133219Z 8085756b0cc0: Waiting 2025-09-07T07:39:47.7133356Z 4e2848642431: Waiting 2025-09-07T07:39:47.7133506Z 2b1d0ea7efe0: Pulling fs layer 2025-09-07T07:39:47.7133668Z 7e9ff0c6f103: Waiting 2025-09-07T07:39:47.7133816Z 119ab3bceafa: Waiting 2025-09-07T07:39:47.7133956Z 5e944f1ed1be: Waiting 2025-09-07T07:39:47.7134103Z a625cbbc05b9: Waiting 2025-09-07T07:39:47.7134256Z 04c04be7408f: Pulling fs layer 2025-09-07T07:39:47.7134422Z f8690caa3ac5: Pulling fs layer 2025-09-07T07:39:47.7134581Z 2908d6baaa6b: Pulling fs layer 2025-09-07T07:39:47.7134740Z 364070434a64: Waiting 2025-09-07T07:39:47.7134883Z 0eed4c15712b: Waiting 2025-09-07T07:39:47.7135032Z 37e2336101eb: Pulling fs layer 2025-09-07T07:39:47.7135180Z 6e726ef07b5d: Waiting 2025-09-07T07:39:47.7135327Z af8eadc9eaab: Waiting 2025-09-07T07:39:47.7135470Z 80b2232d952f: Waiting 2025-09-07T07:39:47.7135619Z f1ac881fde33: Pulling fs layer 2025-09-07T07:39:47.7135773Z 092516f71fe3: Waiting 2025-09-07T07:39:47.7135918Z 284119a92cb1: Waiting 
2025-09-07T07:39:47.7136066Z e7769b0d7a82: Waiting 2025-09-07T07:39:47.7136241Z 43b14c67347e: Pulling fs layer 2025-09-07T07:39:47.7136404Z 71f708151a84: Waiting 2025-09-07T07:39:47.7136550Z 96695940d842: Waiting 2025-09-07T07:39:47.7136693Z 8c0825014a62: Waiting 2025-09-07T07:39:47.7136830Z 622d8cfb39ea: Waiting 2025-09-07T07:39:47.7136981Z cc93cd65e90f: Waiting 2025-09-07T07:39:47.7137130Z ba263639b0f4: Waiting 2025-09-07T07:39:47.7137280Z 7ddca6c4c050: Waiting 2025-09-07T07:39:47.7137422Z a5ab7a280382: Waiting 2025-09-07T07:39:47.7137569Z bf4ee4e45e92: Waiting 2025-09-07T07:39:47.7137723Z 8e0d2f63da0a: Waiting 2025-09-07T07:39:47.7137873Z c1b766f9b961: Waiting 2025-09-07T07:39:47.7138016Z da8d8b696333: Waiting 2025-09-07T07:39:47.7138167Z 2b1d0ea7efe0: Waiting 2025-09-07T07:39:47.7138327Z 41619248f604: Waiting 2025-09-07T07:39:47.7138477Z f8690caa3ac5: Waiting 2025-09-07T07:39:47.7138617Z 73aae7958ba1: Waiting 2025-09-07T07:39:47.7138770Z 386b0c49c498: Waiting 2025-09-07T07:39:47.7138916Z 37e2336101eb: Waiting 2025-09-07T07:39:47.7139855Z 04c04be7408f: Waiting 2025-09-07T07:39:47.7140001Z 2908d6baaa6b: Waiting 2025-09-07T07:39:47.7140152Z ac6077ec9fa5: Waiting 2025-09-07T07:39:47.7140310Z ef1340e22a4b: Waiting 2025-09-07T07:39:47.7140462Z be86f8c4f654: Waiting 2025-09-07T07:39:47.7140606Z f1ac881fde33: Waiting 2025-09-07T07:39:47.7140756Z 43b14c67347e: Waiting 2025-09-07T07:39:47.7740597Z 18a5ee5b0e2e: Verifying Checksum 2025-09-07T07:39:47.7740823Z 18a5ee5b0e2e: Download complete 2025-09-07T07:39:47.8568558Z 1c35b7d4b67c: Verifying Checksum 2025-09-07T07:39:47.8568769Z 1c35b7d4b67c: Download complete 2025-09-07T07:39:47.9399129Z 68c20f3c23bb: Download complete 2025-09-07T07:39:48.0144624Z 7efa39950d32: Verifying Checksum 2025-09-07T07:39:48.0145006Z 7efa39950d32: Download complete 2025-09-07T07:39:48.0788603Z e6fdc8487bfe: Verifying Checksum 2025-09-07T07:39:48.0788849Z e6fdc8487bfe: Download complete 2025-09-07T07:39:48.1144371Z a10eb16a7271: Download complete 2025-09-07T07:39:48.1775562Z 7d52cf579654: Verifying Checksum 2025-09-07T07:39:48.1777185Z 7d52cf579654: Download complete 2025-09-07T07:39:48.3075082Z 46fb6a8b3e1d: Download complete 2025-09-07T07:39:48.3924075Z 5ad6977cc38e: Verifying Checksum 2025-09-07T07:39:48.3924361Z 5ad6977cc38e: Download complete 2025-09-07T07:39:48.4594265Z da63046995a2: Verifying Checksum 2025-09-07T07:39:48.4594563Z da63046995a2: Download complete 2025-09-07T07:39:48.5567421Z 78243fdb9906: Download complete 2025-09-07T07:39:48.6862853Z e6fdc8487bfe: Pull complete 2025-09-07T07:39:48.6972623Z 18a5ee5b0e2e: Pull complete 2025-09-07T07:39:49.2634782Z cb6a20fcf4e2: Verifying Checksum 2025-09-07T07:39:49.2635039Z cb6a20fcf4e2: Download complete 2025-09-07T07:39:49.2713276Z 4f4fb700ef54: Verifying Checksum 2025-09-07T07:39:49.2713547Z 4f4fb700ef54: Download complete 2025-09-07T07:39:49.3581147Z 69715d3ad3c4: Download complete 2025-09-07T07:39:49.4599463Z 7ace90c063f3: Verifying Checksum 2025-09-07T07:39:49.4599826Z 7ace90c063f3: Download complete 2025-09-07T07:39:49.5512503Z acbd5447dd14: Verifying Checksum 2025-09-07T07:39:49.5512749Z acbd5447dd14: Download complete 2025-09-07T07:39:49.6663786Z 744523d9b7f5: Verifying Checksum 2025-09-07T07:39:49.6664070Z 744523d9b7f5: Download complete 2025-09-07T07:39:49.7722996Z 5bd615a7b945: Download complete 2025-09-07T07:39:49.8684945Z f4986a00e3ae: Verifying Checksum 2025-09-07T07:39:49.8685228Z f4986a00e3ae: Download complete 2025-09-07T07:39:49.9319688Z 21902f6e4f8c: Verifying Checksum 
2025-09-07T07:39:49.9319922Z 21902f6e4f8c: Download complete 2025-09-07T07:39:50.0223734Z d80602abf3cc: Verifying Checksum 2025-09-07T07:39:50.0224020Z d80602abf3cc: Download complete 2025-09-07T07:39:50.1097473Z 3c51bf0bc362: Download complete 2025-09-07T07:39:50.2015929Z 119ab3bceafa: Verifying Checksum 2025-09-07T07:39:50.2016218Z 119ab3bceafa: Download complete 2025-09-07T07:39:50.2894078Z af8eadc9eaab: Verifying Checksum 2025-09-07T07:39:50.2894355Z af8eadc9eaab: Download complete 2025-09-07T07:39:50.3508800Z e7769b0d7a82: Download complete 2025-09-07T07:39:50.9124887Z 572424b92528: Verifying Checksum 2025-09-07T07:39:50.9125221Z 572424b92528: Download complete 2025-09-07T07:39:50.9877286Z a5ab7a280382: Verifying Checksum 2025-09-07T07:39:50.9877632Z a5ab7a280382: Download complete 2025-09-07T07:39:51.0764952Z 80b2232d952f: Download complete 2025-09-07T07:39:51.1590206Z cc93cd65e90f: Download complete 2025-09-07T07:39:51.2362016Z 0eed4c15712b: Verifying Checksum 2025-09-07T07:39:51.2362355Z 0eed4c15712b: Download complete 2025-09-07T07:39:51.4694650Z 092516f71fe3: Verifying Checksum 2025-09-07T07:39:51.4694995Z 092516f71fe3: Download complete 2025-09-07T07:39:51.5549472Z 8c0825014a62: Download complete 2025-09-07T07:39:51.6405962Z 8e0d2f63da0a: Verifying Checksum 2025-09-07T07:39:51.6406346Z 8e0d2f63da0a: Download complete 2025-09-07T07:39:51.7159293Z 73aae7958ba1: Verifying Checksum 2025-09-07T07:39:51.7161049Z 73aae7958ba1: Download complete 2025-09-07T07:39:51.7933132Z ac6077ec9fa5: Download complete 2025-09-07T07:39:51.8495117Z bf4ee4e45e92: Verifying Checksum 2025-09-07T07:39:51.8496432Z bf4ee4e45e92: Download complete 2025-09-07T07:39:54.9845774Z ba263639b0f4: Verifying Checksum 2025-09-07T07:39:54.9846130Z ba263639b0f4: Download complete 2025-09-07T07:39:55.0563938Z 6e726ef07b5d: Download complete 2025-09-07T07:39:57.0488882Z 572424b92528: Pull complete 2025-09-07T07:39:57.3302436Z 1c35b7d4b67c: Pull complete 2025-09-07T07:39:57.5994435Z 68c20f3c23bb: Pull complete 2025-09-07T07:39:57.8536836Z 364070434a64: Verifying Checksum 2025-09-07T07:39:57.8537184Z 364070434a64: Download complete 2025-09-07T07:39:57.9338258Z 7efa39950d32: Pull complete 2025-09-07T07:39:58.2850271Z a10eb16a7271: Pull complete 2025-09-07T07:39:58.5416531Z 7d52cf579654: Pull complete 2025-09-07T07:40:00.3107618Z cb6a20fcf4e2: Pull complete 2025-09-07T07:40:00.4476572Z 46fb6a8b3e1d: Pull complete 2025-09-07T07:40:00.6675436Z 5ad6977cc38e: Pull complete 2025-09-07T07:40:00.9389954Z da63046995a2: Pull complete 2025-09-07T07:40:01.2063062Z 78243fdb9906: Pull complete 2025-09-07T07:40:22.5531886Z 6f70d5d50aba: Verifying Checksum 2025-09-07T07:40:22.5532284Z 6f70d5d50aba: Download complete 2025-09-07T07:40:22.6384551Z 622d8cfb39ea: Download complete 2025-09-07T07:40:22.7219647Z 284119a92cb1: Verifying Checksum 2025-09-07T07:40:22.7219993Z 284119a92cb1: Download complete 2025-09-07T07:40:22.8062254Z 96695940d842: Verifying Checksum 2025-09-07T07:40:22.8062582Z 96695940d842: Download complete 2025-09-07T07:40:22.8962847Z 7ddca6c4c050: Verifying Checksum 2025-09-07T07:40:22.8963213Z 7ddca6c4c050: Download complete 2025-09-07T07:40:22.9825300Z a95e1f2f1aad: Verifying Checksum 2025-09-07T07:40:22.9825563Z a95e1f2f1aad: Download complete 2025-09-07T07:40:23.0777718Z 8085756b0cc0: Download complete 2025-09-07T07:40:23.1714020Z 7e9ff0c6f103: Verifying Checksum 2025-09-07T07:40:23.1714360Z 7e9ff0c6f103: Download complete 2025-09-07T07:40:23.2617047Z a625cbbc05b9: Verifying Checksum 2025-09-07T07:40:23.2617391Z 
a625cbbc05b9: Download complete 2025-09-07T07:40:23.3475066Z 4e2848642431: Verifying Checksum 2025-09-07T07:40:23.3475294Z 4e2848642431: Download complete 2025-09-07T07:40:23.4154988Z 5e944f1ed1be: Verifying Checksum 2025-09-07T07:40:23.4155262Z 5e944f1ed1be: Download complete 2025-09-07T07:40:23.4746614Z 41619248f604: Verifying Checksum 2025-09-07T07:40:23.4746862Z 41619248f604: Download complete 2025-09-07T07:40:23.5406609Z be86f8c4f654: Download complete 2025-09-07T07:40:23.6304954Z ef1340e22a4b: Download complete 2025-09-07T07:40:23.7081747Z da8d8b696333: Verifying Checksum 2025-09-07T07:40:23.7082245Z da8d8b696333: Download complete 2025-09-07T07:40:26.1836726Z 386b0c49c498: Verifying Checksum 2025-09-07T07:40:26.1837037Z 386b0c49c498: Download complete 2025-09-07T07:40:26.2467532Z 2b1d0ea7efe0: Download complete 2025-09-07T07:40:26.3444691Z 04c04be7408f: Download complete 2025-09-07T07:40:26.4146463Z f8690caa3ac5: Download complete 2025-09-07T07:40:26.5223472Z 2908d6baaa6b: Verifying Checksum 2025-09-07T07:40:26.5223813Z 2908d6baaa6b: Download complete 2025-09-07T07:40:26.5914923Z 37e2336101eb: Verifying Checksum 2025-09-07T07:40:26.5915191Z 37e2336101eb: Download complete 2025-09-07T07:40:26.6798419Z f1ac881fde33: Verifying Checksum 2025-09-07T07:40:26.6798670Z f1ac881fde33: Download complete 2025-09-07T07:40:27.2600823Z 43b14c67347e: Verifying Checksum 2025-09-07T07:40:27.2601230Z 43b14c67347e: Download complete 2025-09-07T07:40:58.5537859Z 6f70d5d50aba: Pull complete 2025-09-07T07:40:58.8372120Z 4f4fb700ef54: Pull complete 2025-09-07T07:40:59.1621241Z 69715d3ad3c4: Pull complete 2025-09-07T07:40:59.5403755Z 7ace90c063f3: Pull complete 2025-09-07T07:40:59.8868790Z acbd5447dd14: Pull complete 2025-09-07T07:41:00.4486443Z 744523d9b7f5: Pull complete 2025-09-07T07:41:01.0250847Z 5bd615a7b945: Pull complete 2025-09-07T07:41:01.3866901Z f4986a00e3ae: Pull complete 2025-09-07T07:41:01.7018501Z 21902f6e4f8c: Pull complete 2025-09-07T07:41:02.0761182Z d80602abf3cc: Pull complete 2025-09-07T07:41:02.3978786Z 3c51bf0bc362: Pull complete 2025-09-07T07:41:02.4294041Z 71f708151a84: Verifying Checksum 2025-09-07T07:41:02.4294575Z 71f708151a84: Download complete 2025-09-07T07:41:02.7936548Z 119ab3bceafa: Pull complete 2025-09-07T07:41:03.8224957Z af8eadc9eaab: Pull complete 2025-09-07T07:41:04.2807759Z e7769b0d7a82: Pull complete 2025-09-07T07:41:10.0149245Z ba263639b0f4: Pull complete 2025-09-07T07:41:10.5898355Z a5ab7a280382: Pull complete 2025-09-07T07:41:11.0884819Z 80b2232d952f: Pull complete 2025-09-07T07:41:12.0130605Z cc93cd65e90f: Pull complete 2025-09-07T07:41:12.4428442Z 0eed4c15712b: Pull complete 2025-09-07T07:41:13.0931040Z 092516f71fe3: Pull complete 2025-09-07T07:41:13.5921043Z 8c0825014a62: Pull complete 2025-09-07T07:41:14.0491527Z 8e0d2f63da0a: Pull complete 2025-09-07T07:41:14.9034879Z 73aae7958ba1: Pull complete 2025-09-07T07:41:15.3752673Z ac6077ec9fa5: Pull complete 2025-09-07T07:41:15.8270389Z bf4ee4e45e92: Pull complete 2025-09-07T07:42:58.0759839Z c1b766f9b961: Download complete 2025-09-07T07:44:56.9823742Z c1b766f9b961: Pull complete 2025-09-07T07:44:57.5516586Z 6e726ef07b5d: Pull complete 2025-09-07T07:44:59.0744696Z 364070434a64: Pull complete 2025-09-07T07:46:10.0915688Z 71f708151a84: Pull complete 2025-09-07T07:46:10.4922399Z 622d8cfb39ea: Pull complete 2025-09-07T07:46:10.8428945Z 284119a92cb1: Pull complete 2025-09-07T07:46:11.4448044Z 96695940d842: Pull complete 2025-09-07T07:46:12.1068017Z 7ddca6c4c050: Pull complete 2025-09-07T07:46:12.6266891Z a95e1f2f1aad: 
Pull complete 2025-09-07T07:46:13.2917868Z 8085756b0cc0: Pull complete 2025-09-07T07:46:14.2013598Z 7e9ff0c6f103: Pull complete 2025-09-07T07:46:14.6601706Z a625cbbc05b9: Pull complete 2025-09-07T07:46:15.6011846Z 4e2848642431: Pull complete 2025-09-07T07:46:16.0646445Z 5e944f1ed1be: Pull complete 2025-09-07T07:46:16.7746817Z 41619248f604: Pull complete 2025-09-07T07:46:17.0256012Z be86f8c4f654: Pull complete 2025-09-07T07:46:17.9146398Z ef1340e22a4b: Pull complete 2025-09-07T07:46:18.3821072Z da8d8b696333: Pull complete 2025-09-07T07:46:23.3363766Z 386b0c49c498: Pull complete 2025-09-07T07:46:23.7076400Z 2b1d0ea7efe0: Pull complete 2025-09-07T07:46:24.0494290Z 04c04be7408f: Pull complete 2025-09-07T07:46:24.5078164Z f8690caa3ac5: Pull complete 2025-09-07T07:46:25.0783635Z 2908d6baaa6b: Pull complete 2025-09-07T07:46:25.5477182Z 37e2336101eb: Pull complete 2025-09-07T07:46:26.3214699Z f1ac881fde33: Pull complete 2025-09-07T07:46:27.7965970Z 43b14c67347e: Pull complete 2025-09-07T07:46:28.4559451Z Digest: sha256:383efb45082f20b8c808cb0ba4df693a01359592233f641f1f486911ac320a9a 2025-09-07T07:46:28.5622709Z Status: Downloaded newer image for 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:46:28.5914301Z 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:46:28.5969355Z ##[group]Run echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-09-07T07:46:28.5969949Z echo "IN_CONTAINER_RUNNER=$(if [ -f /.inarc ] || [ -f /.incontainer ]; then echo true ; else echo false; fi)" >> "$GITHUB_OUTPUT" 2025-09-07T07:46:28.5977710Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:28.5977948Z env: 2025-09-07T07:46:28.5978112Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:28.5978299Z ##[endgroup] 2025-09-07T07:46:28.6042976Z Prepare all required actions 2025-09-07T07:46:28.6087756Z ##[group]Run ./.github/actions/get-workflow-job-id 2025-09-07T07:46:28.6087973Z with: 2025-09-07T07:46:28.6088461Z github-token: *** 2025-09-07T07:46:28.6088608Z env: 2025-09-07T07:46:28.6088754Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:28.6088927Z ##[endgroup] 2025-09-07T07:46:28.6229036Z ##[group]Run set -eux 2025-09-07T07:46:28.6229205Z set -eux 2025-09-07T07:46:28.6229472Z python3 .github/scripts/get_workflow_job_id.py "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-09-07T07:46:28.6233172Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:28.6233536Z env: 2025-09-07T07:46:28.6233680Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:28.6233995Z GITHUB_TOKEN: *** 2025-09-07T07:46:28.6234147Z ##[endgroup] 2025-09-07T07:46:28.6251661Z + python3 .github/scripts/get_workflow_job_id.py 17525285611 i-081e6be8c4291059d 2025-09-07T07:46:29.2096981Z Setting output job-id=49775585769 2025-09-07T07:46:29.2097499Z Setting output job-name=inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:29.2195073Z ##[group]Run python3 -m pip install psutil==5.9.8 dataclasses_json==0.6.7 nvidia-ml-py==11.525.84 2025-09-07T07:46:29.2195500Z python3 -m pip install psutil==5.9.8 dataclasses_json==0.6.7 nvidia-ml-py==11.525.84 2025-09-07T07:46:29.2196031Z python3 -m tools.stats.monitor --log-interval "$MONITOR_LOG_INTERVAL" --data-collect-interval 
"$MONITOR_DATA_COLLECT_INTERVAL" > usage_log.txt 2>&1 & 2025-09-07T07:46:29.2196507Z echo "monitor-script-pid=${!}" >> "${GITHUB_OUTPUT}" 2025-09-07T07:46:29.2200563Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:29.2200779Z env: 2025-09-07T07:46:29.2200924Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:29.2201090Z JOB_ID: 49775585769 2025-09-07T07:46:29.2201412Z JOB_NAME: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:29.2201766Z WORKFLOW_NAME: inductor-perf-nightly-x86 2025-09-07T07:46:29.2202007Z WORKFLOW_RUN_ID: 17525285611 2025-09-07T07:46:29.2202192Z MONITOR_LOG_INTERVAL: 15 2025-09-07T07:46:29.2202359Z MONITOR_DATA_COLLECT_INTERVAL: 4 2025-09-07T07:46:29.2202540Z ##[endgroup] 2025-09-07T07:46:29.6694110Z Defaulting to user installation because normal site-packages is not writeable 2025-09-07T07:46:30.1421836Z Collecting psutil==5.9.8 2025-09-07T07:46:30.1531318Z Downloading psutil-5.9.8-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (288 kB) 2025-09-07T07:46:30.2863438Z Collecting dataclasses_json==0.6.7 2025-09-07T07:46:30.2887918Z Downloading dataclasses_json-0.6.7-py3-none-any.whl (28 kB) 2025-09-07T07:46:30.3452863Z Collecting nvidia-ml-py==11.525.84 2025-09-07T07:46:30.3474741Z Downloading nvidia_ml_py-11.525.84-py3-none-any.whl (34 kB) 2025-09-07T07:46:30.3969832Z Collecting typing-inspect<1,>=0.4.0 2025-09-07T07:46:30.3992400Z Downloading typing_inspect-0.9.0-py3-none-any.whl (8.8 kB) 2025-09-07T07:46:30.5228796Z Collecting marshmallow<4.0.0,>=3.18.0 2025-09-07T07:46:30.5252787Z Downloading marshmallow-3.26.1-py3-none-any.whl (50 kB) 2025-09-07T07:46:30.6565181Z Collecting packaging>=17.0 2025-09-07T07:46:30.6588491Z Downloading packaging-25.0-py3-none-any.whl (66 kB) 2025-09-07T07:46:30.7875450Z Collecting typing-extensions>=3.7.4 2025-09-07T07:46:30.7897694Z Downloading typing_extensions-4.15.0-py3-none-any.whl (44 kB) 2025-09-07T07:46:30.9243768Z Collecting mypy-extensions>=0.3.0 2025-09-07T07:46:30.9264542Z Downloading mypy_extensions-1.1.0-py3-none-any.whl (5.0 kB) 2025-09-07T07:46:31.1437546Z Installing collected packages: typing-extensions, packaging, mypy-extensions, typing-inspect, marshmallow, psutil, nvidia-ml-py, dataclasses-json 2025-09-07T07:46:31.3919073Z Successfully installed dataclasses-json-0.6.7 marshmallow-3.26.1 mypy-extensions-1.1.0 nvidia-ml-py-11.525.84 packaging-25.0 psutil-5.9.8 typing-extensions-4.15.0 typing-inspect-0.9.0 2025-09-07T07:46:31.5354717Z Prepare all required actions 2025-09-07T07:46:31.5354988Z Getting action download info 2025-09-07T07:46:31.6595091Z Download action repository 'seemethere/download-artifact-s3@v4' (SHA:1da556a7aa0a088e3153970611f6c432d58e80e6) 2025-09-07T07:46:31.8802405Z Download action repository 'actions/download-artifact@v4' (SHA:d3f86a106a0bac45b974a628896c90dbdf5c8093) 2025-09-07T07:46:32.2155734Z ##[group]Run ./.github/actions/download-build-artifacts 2025-09-07T07:46:32.2155961Z with: 2025-09-07T07:46:32.2156125Z name: linux-jammy-py3.9-gcc11-build 2025-09-07T07:46:32.2156425Z s3-bucket: gha-artifacts 2025-09-07T07:46:32.2156600Z env: 2025-09-07T07:46:32.2156750Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:32.2156927Z ##[endgroup] 2025-09-07T07:46:32.2364827Z ##[group]Run seemethere/download-artifact-s3@v4 2025-09-07T07:46:32.2365041Z with: 2025-09-07T07:46:32.2365203Z name: linux-jammy-py3.9-gcc11-build 2025-09-07T07:46:32.2365406Z s3-bucket: gha-artifacts 
2025-09-07T07:46:32.2365629Z region: us-east-1 2025-09-07T07:46:32.2365774Z env: 2025-09-07T07:46:32.2365913Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:32.2366081Z ##[endgroup] 2025-09-07T07:46:32.8469150Z (node:57329) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-09-07T07:46:32.8469543Z 2025-09-07T07:46:32.8469740Z Please migrate your code to use AWS SDK for JavaScript (v3). 2025-09-07T07:46:32.8470081Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-09-07T07:46:32.8470421Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-09-07T07:46:33.4441746Z Found 1 objects with prefix pytorch/pytorch/17525285611/linux-jammy-py3.9-gcc11-build/ 2025-09-07T07:46:33.4442282Z Starting download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-09-07T07:46:38.0550354Z Finished download (1/1): /home/ec2-user/actions-runner/_work/pytorch/pytorch/artifacts.zip 2025-09-07T07:46:38.0552380Z Artifact download has finished successfully 2025-09-07T07:46:38.0764916Z ##[group]Run unzip -o artifacts.zip 2025-09-07T07:46:38.0765145Z unzip -o artifacts.zip 2025-09-07T07:46:38.0769894Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:38.0770113Z env: 2025-09-07T07:46:38.0770258Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:38.0770434Z ##[endgroup] 2025-09-07T07:46:38.1229281Z Archive: artifacts.zip 2025-09-07T07:46:38.1229603Z creating: dist/ 2025-09-07T07:46:39.1360399Z inflating: dist/torch-2.9.0a0+git93fb23d-cp39-cp39-linux_x86_64.whl 2025-09-07T07:46:39.1360838Z creating: dist/vision/ 2025-09-07T07:46:39.1431049Z inflating: dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-09-07T07:46:39.1431388Z creating: dist/audio/ 2025-09-07T07:46:39.1458318Z inflating: dist/audio/torchaudio-2.8.0a0+2e30055-cp39-cp39-linux_x86_64.whl 2025-09-07T07:46:39.1458606Z creating: dist/ao/ 2025-09-07T07:46:39.1493843Z inflating: dist/ao/torchao-0.7.0+git51c87b6e-py3-none-any.whl 2025-09-07T07:46:39.1600578Z inflating: dist/.ninja_log 2025-09-07T07:46:39.1600933Z creating: build/custom_test_artifacts/ 2025-09-07T07:46:39.1601220Z creating: build/custom_test_artifacts/custom-op-build/ 2025-09-07T07:46:39.1601521Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/ 2025-09-07T07:46:39.1601866Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/pkgRedirects/ 2025-09-07T07:46:39.1603789Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeConfigureLog.yaml 2025-09-07T07:46:39.1604236Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/ 2025-09-07T07:46:39.1604608Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-09-07T07:46:39.1605001Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-09-07T07:46:39.1605639Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-09-07T07:46:39.1606276Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-09-07T07:46:39.1607386Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-09-07T07:46:39.1607809Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-09-07T07:46:39.1608218Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-09-07T07:46:39.1608604Z 
creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-09-07T07:46:39.1610135Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-09-07T07:46:39.1611100Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-09-07T07:46:39.1611658Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-09-07T07:46:39.1612852Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-09-07T07:46:39.1614081Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-09-07T07:46:39.1614509Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeScratch/ 2025-09-07T07:46:39.1614884Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/cmake.check_cache 2025-09-07T07:46:39.1615265Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/ 2025-09-07T07:46:39.1615690Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.ts 2025-09-07T07:46:39.1616157Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/compiler_depend.make 2025-09-07T07:46:39.1616597Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/depend.make 2025-09-07T07:46:39.1617010Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/link.txt 2025-09-07T07:46:39.1617440Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/cmake_clean.cmake 2025-09-07T07:46:39.1617867Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/build.make 2025-09-07T07:46:39.1618296Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/DependInfo.cmake 2025-09-07T07:46:39.1618725Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/flags.make 2025-09-07T07:46:39.1619145Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/progress.make 2025-09-07T07:46:39.1633814Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o.d 2025-09-07T07:46:39.1799102Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/custom_ops.dir/op.cpp.o 2025-09-07T07:46:39.1799662Z creating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/ 2025-09-07T07:46:39.1800114Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.ts 2025-09-07T07:46:39.1800595Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/compiler_depend.make 2025-09-07T07:46:39.1801065Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/depend.make 2025-09-07T07:46:39.1801494Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/link.txt 2025-09-07T07:46:39.1801942Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/cmake_clean.cmake 2025-09-07T07:46:39.1802389Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/build.make 2025-09-07T07:46:39.1803040Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/DependInfo.cmake 2025-09-07T07:46:39.1803480Z inflating: 
build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/flags.make 2025-09-07T07:46:39.1803910Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/progress.make 2025-09-07T07:46:39.1816235Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o.d 2025-09-07T07:46:39.1882163Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/test_custom_ops.dir/test_custom_ops.cpp.o 2025-09-07T07:46:39.1882939Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-09-07T07:46:39.1883378Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/TargetDirectories.txt 2025-09-07T07:46:39.1883777Z extracting: build/custom_test_artifacts/custom-op-build/CMakeFiles/progress.marks 2025-09-07T07:46:39.1884151Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile2 2025-09-07T07:46:39.1884518Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/Makefile.cmake 2025-09-07T07:46:39.1884906Z inflating: build/custom_test_artifacts/custom-op-build/CMakeFiles/InstallScripts.json 2025-09-07T07:46:39.1885737Z inflating: build/custom_test_artifacts/custom-op-build/CMakeCache.txt 2025-09-07T07:46:39.1886652Z inflating: build/custom_test_artifacts/custom-op-build/Makefile 2025-09-07T07:46:39.1887062Z inflating: build/custom_test_artifacts/custom-op-build/cmake_install.cmake 2025-09-07T07:46:39.2029764Z inflating: build/custom_test_artifacts/custom-op-build/libcustom_ops.so 2025-09-07T07:46:39.2076350Z inflating: build/custom_test_artifacts/custom-op-build/test_custom_ops 2025-09-07T07:46:39.2076659Z creating: build/custom_test_artifacts/jit-hook-build/ 2025-09-07T07:46:39.2076947Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/ 2025-09-07T07:46:39.2077288Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/pkgRedirects/ 2025-09-07T07:46:39.2079841Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeConfigureLog.yaml 2025-09-07T07:46:39.2080205Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/ 2025-09-07T07:46:39.2080565Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-09-07T07:46:39.2080996Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-09-07T07:46:39.2081381Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-09-07T07:46:39.2084484Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-09-07T07:46:39.2085035Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-09-07T07:46:39.2085465Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-09-07T07:46:39.2085891Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-09-07T07:46:39.2086282Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-09-07T07:46:39.2086747Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-09-07T07:46:39.2087473Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-09-07T07:46:39.2087953Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-09-07T07:46:39.2088898Z inflating: 
build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-09-07T07:46:39.2090001Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-09-07T07:46:39.2090432Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeScratch/ 2025-09-07T07:46:39.2090998Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/cmake.check_cache 2025-09-07T07:46:39.2091387Z creating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/ 2025-09-07T07:46:39.2091805Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.ts 2025-09-07T07:46:39.2092261Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/compiler_depend.make 2025-09-07T07:46:39.2092711Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/depend.make 2025-09-07T07:46:39.2093206Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/link.txt 2025-09-07T07:46:39.2093637Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/cmake_clean.cmake 2025-09-07T07:46:39.2094076Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/build.make 2025-09-07T07:46:39.2094503Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/DependInfo.cmake 2025-09-07T07:46:39.2094939Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/flags.make 2025-09-07T07:46:39.2095371Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/progress.make 2025-09-07T07:46:39.2109672Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o.d 2025-09-07T07:46:39.2160846Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/test_jit_hooks.dir/test_jit_hooks.cpp.o 2025-09-07T07:46:39.2161315Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-09-07T07:46:39.2161753Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/TargetDirectories.txt 2025-09-07T07:46:39.2162138Z extracting: build/custom_test_artifacts/jit-hook-build/CMakeFiles/progress.marks 2025-09-07T07:46:39.2162503Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile2 2025-09-07T07:46:39.2163040Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/Makefile.cmake 2025-09-07T07:46:39.2163420Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeFiles/InstallScripts.json 2025-09-07T07:46:39.2164487Z inflating: build/custom_test_artifacts/jit-hook-build/CMakeCache.txt 2025-09-07T07:46:39.2164993Z inflating: build/custom_test_artifacts/jit-hook-build/Makefile 2025-09-07T07:46:39.2165356Z inflating: build/custom_test_artifacts/jit-hook-build/cmake_install.cmake 2025-09-07T07:46:39.2197232Z inflating: build/custom_test_artifacts/jit-hook-build/test_jit_hooks 2025-09-07T07:46:39.2197537Z creating: build/custom_test_artifacts/custom-backend-build/ 2025-09-07T07:46:39.2197842Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/ 2025-09-07T07:46:39.2198198Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/pkgRedirects/ 2025-09-07T07:46:39.2200657Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeConfigureLog.yaml 2025-09-07T07:46:39.2201051Z creating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/ 2025-09-07T07:46:39.2201432Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeSystem.cmake 2025-09-07T07:46:39.2201842Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/ 2025-09-07T07:46:39.2202237Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/tmp/ 2025-09-07T07:46:39.2203254Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/CMakeCCompilerId.c 2025-09-07T07:46:39.2204255Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdC/a.out 2025-09-07T07:46:39.2204770Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeCCompiler.cmake 2025-09-07T07:46:39.2205196Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/ 2025-09-07T07:46:39.2205602Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/tmp/ 2025-09-07T07:46:39.2206883Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/CMakeCXXCompilerId.cpp 2025-09-07T07:46:39.2207949Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CompilerIdCXX/a.out 2025-09-07T07:46:39.2208594Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeCXXCompiler.cmake 2025-09-07T07:46:39.2209716Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_C.bin 2025-09-07T07:46:39.2210801Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/4.0.0/CMakeDetermineCompilerABI_CXX.bin 2025-09-07T07:46:39.2211251Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeScratch/ 2025-09-07T07:46:39.2211621Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/cmake.check_cache 2025-09-07T07:46:39.2212021Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/ 2025-09-07T07:46:39.2212459Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.ts 2025-09-07T07:46:39.2212953Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/compiler_depend.make 2025-09-07T07:46:39.2213428Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/depend.make 2025-09-07T07:46:39.2213873Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/link.txt 2025-09-07T07:46:39.2214340Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/cmake_clean.cmake 2025-09-07T07:46:39.2214804Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/build.make 2025-09-07T07:46:39.2215265Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/DependInfo.cmake 2025-09-07T07:46:39.2215724Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/flags.make 2025-09-07T07:46:39.2216169Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/progress.make 2025-09-07T07:46:39.2217388Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o.d 2025-09-07T07:46:39.2314522Z inflating: 
build/custom_test_artifacts/custom-backend-build/CMakeFiles/custom_backend.dir/custom_backend.cpp.o 2025-09-07T07:46:39.2314989Z creating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/ 2025-09-07T07:46:39.2315456Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.ts 2025-09-07T07:46:39.2315979Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/compiler_depend.make 2025-09-07T07:46:39.2316475Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/depend.make 2025-09-07T07:46:39.2316944Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/link.txt 2025-09-07T07:46:39.2317427Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/cmake_clean.cmake 2025-09-07T07:46:39.2317921Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/build.make 2025-09-07T07:46:39.2318407Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/DependInfo.cmake 2025-09-07T07:46:39.2318949Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/flags.make 2025-09-07T07:46:39.2319422Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/progress.make 2025-09-07T07:46:39.2334216Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o.d 2025-09-07T07:46:39.2378176Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/test_custom_backend.dir/test_custom_backend.cpp.o 2025-09-07T07:46:39.2378695Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/CMakeDirectoryInformation.cmake 2025-09-07T07:46:39.2379326Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/TargetDirectories.txt 2025-09-07T07:46:39.2379746Z extracting: build/custom_test_artifacts/custom-backend-build/CMakeFiles/progress.marks 2025-09-07T07:46:39.2380374Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile2 2025-09-07T07:46:39.2381102Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/Makefile.cmake 2025-09-07T07:46:39.2381543Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeFiles/InstallScripts.json 2025-09-07T07:46:39.2382498Z inflating: build/custom_test_artifacts/custom-backend-build/CMakeCache.txt 2025-09-07T07:46:39.2383073Z inflating: build/custom_test_artifacts/custom-backend-build/Makefile 2025-09-07T07:46:39.2383512Z inflating: build/custom_test_artifacts/custom-backend-build/cmake_install.cmake 2025-09-07T07:46:39.2466832Z inflating: build/custom_test_artifacts/custom-backend-build/libcustom_backend.so 2025-09-07T07:46:39.2499347Z inflating: build/custom_test_artifacts/custom-backend-build/test_custom_backend 2025-09-07T07:46:39.2501133Z creating: build/lib/ 2025-09-07T07:46:39.2568887Z inflating: build/lib/libprotobuf-lite.a 2025-09-07T07:46:39.2945148Z inflating: build/lib/libprotobuf.a 2025-09-07T07:46:39.3369510Z inflating: build/lib/libprotoc.a 2025-09-07T07:46:39.3375816Z inflating: build/lib/libpthreadpool.a 2025-09-07T07:46:39.3382355Z inflating: build/lib/libcpuinfo.a 2025-09-07T07:46:39.3388917Z inflating: build/lib/libcpuinfo_internals.a 2025-09-07T07:46:39.3389588Z inflating: build/lib/libclog.a 
2025-09-07T07:46:39.3405600Z inflating: build/lib/libpytorch_qnnpack.a 2025-09-07T07:46:39.3407330Z inflating: build/lib/libnnpack_reference_layers.a 2025-09-07T07:46:39.3566027Z inflating: build/lib/libmicrokernels-prod.a 2025-09-07T07:46:39.3581102Z inflating: build/lib/libnnpack.a 2025-09-07T07:46:39.4330483Z inflating: build/lib/libmicrokernels-all.a 2025-09-07T07:46:39.4387142Z inflating: build/lib/libgtest.a 2025-09-07T07:46:39.4401351Z inflating: build/lib/libgmock.a 2025-09-07T07:46:39.4401827Z inflating: build/lib/libgtest_main.a 2025-09-07T07:46:39.4402276Z inflating: build/lib/libgmock_main.a 2025-09-07T07:46:39.4479021Z inflating: build/lib/libXNNPACK.a 2025-09-07T07:46:39.4543183Z inflating: build/lib/libbenchmark.a 2025-09-07T07:46:39.4543656Z inflating: build/lib/libbenchmark_main.a 2025-09-07T07:46:39.4544205Z inflating: build/lib/libjitprofiling.a 2025-09-07T07:46:39.4550725Z inflating: build/lib/libittnotify.a 2025-09-07T07:46:39.4606818Z inflating: build/lib/libasmjit.a 2025-09-07T07:46:39.5586871Z inflating: build/lib/libfbgemm.a 2025-09-07T07:46:39.5610736Z inflating: build/lib/libtensorpipe_uv.a 2025-09-07T07:46:39.6076146Z inflating: build/lib/libtensorpipe.a 2025-09-07T07:46:39.6176009Z inflating: build/lib/libgloo.a 2025-09-07T07:46:39.6216455Z inflating: build/lib/libonnx_proto.a 2025-09-07T07:46:39.6818095Z inflating: build/lib/libonnx.a 2025-09-07T07:46:40.5335074Z inflating: build/lib/libdnnl.a 2025-09-07T07:46:40.5349638Z inflating: build/lib/libfmt.a 2025-09-07T07:46:40.5575243Z inflating: build/lib/libkineto.a 2025-09-07T07:46:40.5668875Z inflating: build/lib/libc10.so 2025-09-07T07:46:40.5670235Z inflating: build/lib/libtorch_global_deps.so 2025-09-07T07:46:43.1126955Z inflating: build/lib/libtorch_cpu.so 2025-09-07T07:46:43.1127326Z inflating: build/lib/libtorch.so 2025-09-07T07:46:43.1186666Z inflating: build/lib/libtorchbind_test.so 2025-09-07T07:46:43.1202569Z inflating: build/lib/libjitbackend_test.so 2025-09-07T07:46:43.1223154Z inflating: build/lib/libbackend_with_compiler.so 2025-09-07T07:46:43.1245349Z inflating: build/lib/libaoti_custom_ops.so 2025-09-07T07:46:43.1248500Z inflating: build/lib/libshm.so 2025-09-07T07:46:43.3004132Z inflating: build/lib/libtorch_python.so 2025-09-07T07:46:43.3032870Z inflating: build/lib/libnnapi_backend.so 2025-09-07T07:46:43.3033129Z creating: build/bin/ 2025-09-07T07:46:43.3033309Z creating: build/bin/CMakeFiles/ 2025-09-07T07:46:43.3033609Z inflating: build/bin/cmake_install.cmake 2025-09-07T07:46:43.3033861Z inflating: build/bin/CTestTestfile.cmake 2025-09-07T07:46:43.3427432Z inflating: build/bin/protoc-3.13.0.0 2025-09-07T07:46:43.3819921Z inflating: build/bin/protoc 2025-09-07T07:46:43.3869777Z inflating: build/bin/c10_AllocatorConfig_test 2025-09-07T07:46:43.3917154Z inflating: build/bin/c10_CompileTimeFunctionPointer_test 2025-09-07T07:46:43.3966116Z inflating: build/bin/c10_DeviceGuard_test 2025-09-07T07:46:43.4015476Z inflating: build/bin/c10_Device_test 2025-09-07T07:46:43.4071673Z inflating: build/bin/c10_DispatchKeySet_test 2025-09-07T07:46:43.4118500Z inflating: build/bin/c10_StreamGuard_test 2025-09-07T07:46:43.4169760Z inflating: build/bin/c10_Scalar_test 2025-09-07T07:46:43.4223824Z inflating: build/bin/c10_SymInt_test 2025-09-07T07:46:43.4275337Z inflating: build/bin/c10_InlineDeviceGuard_test 2025-09-07T07:46:43.4328577Z inflating: build/bin/c10_SizesAndStrides_test 2025-09-07T07:46:43.4381457Z inflating: build/bin/c10_InlineStreamGuard_test 2025-09-07T07:46:43.4428791Z inflating: 
build/bin/c10_ArrayRef_test 2025-09-07T07:46:43.4475443Z inflating: build/bin/c10_ConstexprCrc_test 2025-09-07T07:46:43.4541256Z inflating: build/bin/c10_cow_test 2025-09-07T07:46:43.4591800Z inflating: build/bin/c10_Bitset_test 2025-09-07T07:46:43.4645731Z inflating: build/bin/c10_Enumerate_test 2025-09-07T07:46:43.4693260Z inflating: build/bin/c10_DeadlockDetection_test 2025-09-07T07:46:43.4743495Z inflating: build/bin/c10_IntrusiveList_test 2025-09-07T07:46:43.4796303Z inflating: build/bin/c10_LeftRight_test 2025-09-07T07:46:43.4844618Z inflating: build/bin/c10_Half_test 2025-09-07T07:46:43.4897072Z inflating: build/bin/c10_Metaprogramming_test 2025-09-07T07:46:43.4947649Z inflating: build/bin/c10_NetworkFlow_test 2025-09-07T07:46:43.4995068Z inflating: build/bin/c10_Semaphore_test 2025-09-07T07:46:43.5043076Z inflating: build/bin/c10_Synchronized_test 2025-09-07T07:46:43.5092118Z inflating: build/bin/c10_TypeIndex_test 2025-09-07T07:46:43.5144604Z inflating: build/bin/c10_ThreadLocal_test 2025-09-07T07:46:43.5193277Z inflating: build/bin/c10_TypeList_test 2025-09-07T07:46:43.5239988Z inflating: build/bin/c10_TypeTraits_test 2025-09-07T07:46:43.5289000Z inflating: build/bin/c10_accumulate_test 2025-09-07T07:46:43.5341952Z inflating: build/bin/c10_bfloat16_test 2025-09-07T07:46:43.5389879Z inflating: build/bin/c10_bit_cast_test 2025-09-07T07:46:43.5443245Z inflating: build/bin/c10_complex_math_test 2025-09-07T07:46:43.5495698Z inflating: build/bin/c10_complex_test 2025-09-07T07:46:43.5542807Z inflating: build/bin/c10_error_test 2025-09-07T07:46:43.5592481Z inflating: build/bin/c10_exception_test 2025-09-07T07:46:43.5640258Z inflating: build/bin/c10_flags_test 2025-09-07T07:46:43.5688297Z inflating: build/bin/c10_generic_math_test 2025-09-07T07:46:43.5736731Z inflating: build/bin/c10_irange_test 2025-09-07T07:46:43.5885524Z inflating: build/bin/c10_intrusive_ptr_test 2025-09-07T07:46:43.5933693Z inflating: build/bin/c10_lazy_test 2025-09-07T07:46:43.5987776Z inflating: build/bin/c10_logging_test 2025-09-07T07:46:43.6057142Z inflating: build/bin/c10_optional_test 2025-09-07T07:46:43.6115465Z inflating: build/bin/c10_ordered_preserving_dict_test 2025-09-07T07:46:43.6166008Z inflating: build/bin/c10_registry_test 2025-09-07T07:46:43.6305464Z inflating: build/bin/c10_small_vector_test 2025-09-07T07:46:43.6352606Z inflating: build/bin/c10_ssize_test 2025-09-07T07:46:43.6405785Z inflating: build/bin/c10_string_util_test 2025-09-07T07:46:43.6452604Z inflating: build/bin/c10_string_view_test 2025-09-07T07:46:43.6500386Z inflating: build/bin/c10_tempfile_test 2025-09-07T07:46:43.6553647Z inflating: build/bin/c10_typeid_test 2025-09-07T07:46:43.6595393Z inflating: build/bin/c10_intrusive_ptr_benchmark 2025-09-07T07:46:43.7106143Z inflating: build/bin/vec_test_all_types_DEFAULT 2025-09-07T07:46:43.7632676Z inflating: build/bin/vec_test_all_types_AVX512 2025-09-07T07:46:43.8165872Z inflating: build/bin/vec_test_all_types_AVX2 2025-09-07T07:46:43.8214578Z inflating: build/bin/static_runtime_bench 2025-09-07T07:46:43.8438126Z inflating: build/bin/static_runtime_test 2025-09-07T07:46:43.8504845Z inflating: build/bin/Dict_test 2025-09-07T07:46:43.8554147Z inflating: build/bin/Dimname_test 2025-09-07T07:46:43.8615131Z inflating: build/bin/MaybeOwned_test 2025-09-07T07:46:43.8668721Z inflating: build/bin/NamedTensor_test 2025-09-07T07:46:43.8724252Z inflating: build/bin/apply_utils_test 2025-09-07T07:46:43.8779701Z inflating: build/bin/atest 2025-09-07T07:46:43.8839878Z inflating: build/bin/basic 
2025-09-07T07:46:43.8892451Z inflating: build/bin/broadcast_test 2025-09-07T07:46:43.8940857Z inflating: build/bin/cpu_allocator_test 2025-09-07T07:46:43.8995826Z inflating: build/bin/cpu_generator_test 2025-09-07T07:46:43.9046295Z inflating: build/bin/cpu_profiling_allocator_test 2025-09-07T07:46:43.9130792Z inflating: build/bin/cpu_rng_test 2025-09-07T07:46:43.9179155Z inflating: build/bin/dlconvertor_test 2025-09-07T07:46:43.9233397Z inflating: build/bin/extension_backend_test 2025-09-07T07:46:43.9285988Z inflating: build/bin/half_test 2025-09-07T07:46:43.9374604Z inflating: build/bin/ivalue_test 2025-09-07T07:46:43.9421902Z inflating: build/bin/lazy_tensor_test 2025-09-07T07:46:43.9472627Z inflating: build/bin/math_kernel_test 2025-09-07T07:46:43.9523510Z inflating: build/bin/memory_format_test 2025-09-07T07:46:43.9574312Z inflating: build/bin/memory_overlapping_test 2025-09-07T07:46:43.9624829Z inflating: build/bin/mobile_memory_cleanup 2025-09-07T07:46:43.9678116Z inflating: build/bin/native_test 2025-09-07T07:46:43.9726591Z inflating: build/bin/operator_name_test 2025-09-07T07:46:43.9774852Z inflating: build/bin/operators_test 2025-09-07T07:46:43.9824304Z inflating: build/bin/packedtensoraccessor_test 2025-09-07T07:46:43.9887132Z inflating: build/bin/pow_test 2025-09-07T07:46:43.9941438Z inflating: build/bin/quantized_test 2025-09-07T07:46:43.9989051Z inflating: build/bin/reduce_ops_test 2025-09-07T07:46:44.0037518Z inflating: build/bin/reportMemoryUsage_test 2025-09-07T07:46:44.0091099Z inflating: build/bin/scalar_tensor_test 2025-09-07T07:46:44.0146412Z inflating: build/bin/scalar_test 2025-09-07T07:46:44.0195770Z inflating: build/bin/StorageUtils_test 2025-09-07T07:46:44.0245145Z inflating: build/bin/stride_properties_test 2025-09-07T07:46:44.0317741Z inflating: build/bin/tensor_iterator_test 2025-09-07T07:46:44.0369249Z inflating: build/bin/test_parallel 2025-09-07T07:46:44.0417470Z inflating: build/bin/thread_init_test 2025-09-07T07:46:44.0469662Z inflating: build/bin/type_ptr_test 2025-09-07T07:46:44.0525343Z inflating: build/bin/type_test 2025-09-07T07:46:44.0575494Z inflating: build/bin/undefined_tensor_test 2025-09-07T07:46:44.0622643Z inflating: build/bin/verify_api_visibility 2025-09-07T07:46:44.0688098Z inflating: build/bin/legacy_vmap_test 2025-09-07T07:46:44.0736438Z inflating: build/bin/weakref_test 2025-09-07T07:46:44.0785501Z inflating: build/bin/wrapdim_test 2025-09-07T07:46:44.0834356Z inflating: build/bin/xla_tensor_test 2025-09-07T07:46:44.0890728Z inflating: build/bin/IListRef_test 2025-09-07T07:46:44.0986581Z inflating: build/bin/List_test 2025-09-07T07:46:44.1048164Z inflating: build/bin/KernelFunction_test 2025-09-07T07:46:44.1156999Z inflating: build/bin/kernel_function_legacy_test 2025-09-07T07:46:44.1244038Z inflating: build/bin/kernel_function_test 2025-09-07T07:46:44.1357431Z inflating: build/bin/kernel_lambda_legacy_test 2025-09-07T07:46:44.1449682Z inflating: build/bin/kernel_lambda_test 2025-09-07T07:46:44.1506497Z inflating: build/bin/kernel_stackbased_test 2025-09-07T07:46:44.1593590Z inflating: build/bin/make_boxed_from_unboxed_functor_test 2025-09-07T07:46:44.1642325Z inflating: build/bin/CppSignature_test 2025-09-07T07:46:44.1694664Z inflating: build/bin/backend_fallback_test 2025-09-07T07:46:44.1967941Z inflating: build/bin/op_registration_test 2025-09-07T07:46:44.2012756Z inflating: build/bin/op_allowlist_test 2025-09-07T07:46:44.2075071Z inflating: build/bin/inline_container_test 2025-09-07T07:46:44.3039106Z inflating: build/bin/test_jit 
2025-09-07T07:46:44.3088683Z inflating: build/bin/FileStoreTest 2025-09-07T07:46:44.3138195Z inflating: build/bin/BackoffTest 2025-09-07T07:46:44.3192187Z inflating: build/bin/TCPStoreTest 2025-09-07T07:46:44.3525956Z inflating: build/bin/test_nativert 2025-09-07T07:46:44.3574947Z inflating: build/bin/HashStoreTest 2025-09-07T07:46:44.3577138Z inflating: build/bin/example_allreduce 2025-09-07T07:46:44.3639069Z inflating: build/bin/ProcessGroupGlooTest 2025-09-07T07:46:44.3691138Z inflating: build/bin/test_dist_autograd 2025-09-07T07:46:44.3754413Z inflating: build/bin/test_cpp_rpc 2025-09-07T07:46:44.4745936Z inflating: build/bin/test_api 2025-09-07T07:46:44.4746303Z inflating: build/bin/parallel_benchmark 2025-09-07T07:46:44.5046563Z inflating: build/bin/test_lazy 2025-09-07T07:46:44.5047906Z inflating: build/bin/torch_shm_manager 2025-09-07T07:46:44.5048152Z creating: .additional_ci_files/ 2025-09-07T07:46:44.5123858Z inflating: .additional_ci_files/test-times.json 2025-09-07T07:46:44.5415989Z inflating: .additional_ci_files/test-class-times.json 2025-09-07T07:46:44.5463461Z ##[group]Run rm artifacts.zip 2025-09-07T07:46:44.5463674Z rm artifacts.zip 2025-09-07T07:46:44.5468497Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:44.5468727Z env: 2025-09-07T07:46:44.5468875Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:44.5469048Z ##[endgroup] 2025-09-07T07:46:44.5787900Z ##[group]Run df -H 2025-09-07T07:46:44.5788072Z df -H 2025-09-07T07:46:44.5791742Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:44.5791977Z env: 2025-09-07T07:46:44.5792135Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:44.5792309Z ##[endgroup] 2025-09-07T07:46:44.5824803Z Filesystem Size Used Avail Use% Mounted on 2025-09-07T07:46:44.5825079Z devtmpfs 4.2M 0 4.2M 0% /dev 2025-09-07T07:46:44.5825292Z tmpfs 102G 0 102G 0% /dev/shm 2025-09-07T07:46:44.5825504Z tmpfs 41G 11M 41G 1% /run 2025-09-07T07:46:44.5825709Z /dev/nvme0n1p1 215G 72G 144G 34% / 2025-09-07T07:46:44.5825917Z tmpfs 102G 13k 102G 1% /tmp 2025-09-07T07:46:44.5826165Z /dev/nvme0n1p128 11M 1.4M 9.2M 13% /boot/efi 2025-09-07T07:46:44.5847111Z Prepare all required actions 2025-09-07T07:46:44.5847553Z Getting action download info 2025-09-07T07:46:44.7325718Z ##[group]Run ./.github/actions/download-td-artifacts 2025-09-07T07:46:44.7325941Z with: 2025-09-07T07:46:44.7326082Z env: 2025-09-07T07:46:44.7326228Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:44.7326388Z ##[endgroup] 2025-09-07T07:46:44.7451359Z ##[group]Run seemethere/download-artifact-s3@v4 2025-09-07T07:46:44.7451571Z with: 2025-09-07T07:46:44.7451704Z name: td_results 2025-09-07T07:46:44.7451862Z s3-bucket: gha-artifacts 2025-09-07T07:46:44.7452031Z region: us-east-1 2025-09-07T07:46:44.7452175Z env: 2025-09-07T07:46:44.7452305Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:44.7452467Z ##[endgroup] 2025-09-07T07:46:45.0757141Z (node:57362) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023. 2025-09-07T07:46:45.0757420Z 2025-09-07T07:46:45.0757589Z Please migrate your code to use AWS SDK for JavaScript (v3). 
2025-09-07T07:46:45.0758125Z For more information, check the migration guide at https://a.co/7PzMCcy 2025-09-07T07:46:45.0758442Z (Use `node --trace-warnings ...` to show where the warning was created) 2025-09-07T07:46:45.1594365Z Found 0 objects with prefix pytorch/pytorch/17525285611/td_results/ 2025-09-07T07:46:45.1596889Z Artifact download has finished successfully 2025-09-07T07:46:45.1793553Z ##[group]Run mkdir -p .additional_ci_files 2025-09-07T07:46:45.1793787Z mkdir -p .additional_ci_files 2025-09-07T07:46:45.1794050Z mv td_results.json .additional_ci_files/td_results.json || true 2025-09-07T07:46:45.1798876Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:45.1799094Z env: 2025-09-07T07:46:45.1799243Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:45.1799413Z ##[endgroup] 2025-09-07T07:46:45.1839526Z mv: cannot stat 'td_results.json': No such file or directory 2025-09-07T07:46:45.1928231Z ##[group]Run .github/scripts/parse_ref.py 2025-09-07T07:46:45.1928474Z .github/scripts/parse_ref.py 2025-09-07T07:46:45.1932458Z shell: /usr/bin/bash -e {0} 2025-09-07T07:46:45.1932625Z env: 2025-09-07T07:46:45.1932778Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:45.1932951Z ##[endgroup] 2025-09-07T07:46:45.2589494Z Setting output branch=main 2025-09-07T07:46:45.2664632Z Prepare all required actions 2025-09-07T07:46:45.2664897Z Getting action download info 2025-09-07T07:46:45.3882648Z ##[group]Run ./.github/actions/filter-test-configs 2025-09-07T07:46:45.3882876Z with: 2025-09-07T07:46:45.3883182Z github-token: *** 2025-09-07T07:46:45.3885302Z test-matrix: {"include": [{"config": "inductor_huggingface_perf_cpu_x86", "shard": 1, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 2, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 3, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 1, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 2, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 3, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 4, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 5, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 1, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 2, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 3, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 4, "num_shards": 4, "runner": "linux.24xl.spr-metal"}]} 2025-09-07T07:46:45.3887695Z job-name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:45.3888039Z env: 2025-09-07T07:46:45.3888190Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:45.3888354Z ##[endgroup] 2025-09-07T07:46:45.3949425Z ##[group]Run nick-fields/retry@v3.0.0 2025-09-07T07:46:45.3949625Z with: 2025-09-07T07:46:45.3949770Z shell: bash 2025-09-07T07:46:45.3949924Z timeout_minutes: 10 2025-09-07T07:46:45.3950083Z max_attempts: 5 2025-09-07T07:46:45.3950235Z retry_wait_seconds: 30 2025-09-07T07:46:45.3950671Z command: set -eux # PyYAML 6.0 doesn't work with 
MacOS x86 anymore # This must run on Python-3.7 (AmazonLinux2) so can't use request=3.32.2 python3 -m pip install requests==2.27.1 pyyaml==6.0.2 2025-09-07T07:46:45.3951116Z polling_interval_seconds: 1 2025-09-07T07:46:45.3951287Z warning_on_retry: true 2025-09-07T07:46:45.3951446Z continue_on_error: false 2025-09-07T07:46:45.3951607Z env: 2025-09-07T07:46:45.3951737Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:45.3952023Z GITHUB_TOKEN: *** 2025-09-07T07:46:45.3952268Z ##[endgroup] 2025-09-07T07:46:45.5549700Z + python3 -m pip install requests==2.27.1 pyyaml==6.0.2 2025-09-07T07:46:45.7217790Z Defaulting to user installation because normal site-packages is not writeable 2025-09-07T07:46:45.8380037Z Collecting requests==2.27.1 2025-09-07T07:46:45.8495746Z Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB) 2025-09-07T07:46:46.0183167Z Collecting pyyaml==6.0.2 2025-09-07T07:46:46.0206107Z Downloading PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (737 kB) 2025-09-07T07:46:46.3787234Z Collecting charset-normalizer~=2.0.0 2025-09-07T07:46:46.3811161Z Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB) 2025-09-07T07:46:46.5158701Z Collecting certifi>=2017.4.17 2025-09-07T07:46:46.5181852Z Downloading certifi-2025.8.3-py3-none-any.whl (161 kB) 2025-09-07T07:46:46.5580492Z Requirement already satisfied: idna<4,>=2.5 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (2.10) 2025-09-07T07:46:46.5581102Z Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/lib/python3.9/site-packages (from requests==2.27.1) (1.25.10) 2025-09-07T07:46:46.6168052Z Installing collected packages: charset-normalizer, certifi, requests, pyyaml 2025-09-07T07:46:47.0611796Z Successfully installed certifi-2025.8.3 charset-normalizer-2.0.12 pyyaml-6.0.2 requests-2.27.1 2025-09-07T07:46:47.4515994Z Command completed after 1 attempt(s). 
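The step above wraps the pinned pip install in nick-fields/retry with max_attempts: 5 and retry_wait_seconds: 30, succeeding here on the first attempt. A minimal Python sketch of that retry-with-fixed-wait pattern, assuming only that the command signals failure through its exit code; run_with_retries is a hypothetical helper for illustration, not the action's own code:

import subprocess
import sys
import time

def run_with_retries(cmd, max_attempts=5, retry_wait_seconds=30):
    # Re-run a flaky shell command, mirroring the retry parameters shown above.
    result = None
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(cmd, shell=True)
        if result.returncode == 0:
            print(f"Command completed after {attempt} attempt(s).")
            return
        if attempt < max_attempts:
            print(f"Attempt {attempt} failed, retrying in {retry_wait_seconds}s")
            time.sleep(retry_wait_seconds)
    sys.exit(result.returncode)

if __name__ == "__main__":
    # Hypothetical usage: the same pinned installs the retry step runs above.
    run_with_retries("python3 -m pip install requests==2.27.1 pyyaml==6.0.2")
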
2025-09-07T07:46:47.4559919Z ##[group]Run set -x 2025-09-07T07:46:47.4560096Z set -x 2025-09-07T07:46:47.4560243Z  2025-09-07T07:46:47.4560469Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-09-07T07:46:47.4560745Z # in runner workspace 2025-09-07T07:46:47.4560983Z python3 "${GITHUB_ACTION_PATH}/../../scripts/parse_ref.py" 2025-09-07T07:46:47.4565903Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:47.4566124Z env: 2025-09-07T07:46:47.4566263Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.4566447Z ##[endgroup] 2025-09-07T07:46:47.4584110Z + python3 /home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/filter-test-configs/../../scripts/parse_ref.py 2025-09-07T07:46:47.4714413Z Setting output branch=main 2025-09-07T07:46:47.4767080Z ##[group]Run echo "Workflow: ${GITHUB_WORKFLOW}" 2025-09-07T07:46:47.4767350Z echo "Workflow: ${GITHUB_WORKFLOW}" 2025-09-07T07:46:47.4767559Z echo "Job name: ${JOB_NAME}" 2025-09-07T07:46:47.4767743Z  2025-09-07T07:46:47.4767966Z # Use relative path here as this could be checked out anywhere, not necessarily 2025-09-07T07:46:47.4768240Z # in runner workspace 2025-09-07T07:46:47.4768517Z python3 "${GITHUB_ACTION_PATH}/../../scripts/filter_test_configs.py" \ 2025-09-07T07:46:47.4768801Z  --workflow "${GITHUB_WORKFLOW}" \ 2025-09-07T07:46:47.4769013Z  --job-name "${JOB_NAME}" \ 2025-09-07T07:46:47.4771133Z  --test-matrix "{"include": [{"config": "inductor_huggingface_perf_cpu_x86", "shard": 1, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 2, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 3, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 1, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 2, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 3, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 4, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 5, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 1, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 2, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 3, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 4, "num_shards": 4, "runner": "linux.24xl.spr-metal"}]}" \ 2025-09-07T07:46:47.4773501Z  --selected-test-configs "" \ 2025-09-07T07:46:47.4773718Z  --pr-number "${PR_NUMBER}" \ 2025-09-07T07:46:47.4773917Z  --tag "${TAG}" \ 2025-09-07T07:46:47.4774100Z  --event-name "${EVENT_NAME}" \ 2025-09-07T07:46:47.4774303Z  --schedule "${SCHEDULE}" \ 2025-09-07T07:46:47.4774503Z  --branch "${HEAD_BRANCH}" 2025-09-07T07:46:47.4778205Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:47.4778428Z env: 2025-09-07T07:46:47.4778564Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.4778916Z GITHUB_TOKEN: *** 2025-09-07T07:46:47.4779242Z JOB_NAME: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:47.4779595Z PR_NUMBER: 2025-09-07T07:46:47.4779742Z TAG: 
2025-09-07T07:46:47.4779873Z EVENT_NAME: schedule 2025-09-07T07:46:47.4780035Z SCHEDULE: 0 7 * * * 2025-09-07T07:46:47.4780193Z HEAD_BRANCH: main 2025-09-07T07:46:47.4780341Z ##[endgroup] 2025-09-07T07:46:47.4796392Z Workflow: inductor-perf-nightly-x86 2025-09-07T07:46:47.4796757Z Job name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:47.6254874Z Setting output keep-going=True 2025-09-07T07:46:47.6255236Z Setting output ci-verbose-test-logs=False 2025-09-07T07:46:47.6255515Z Setting output ci-test-showlocals=False 2025-09-07T07:46:47.6255745Z Setting output ci-no-test-timeout=False 2025-09-07T07:46:47.6255964Z Setting output ci-no-td=False 2025-09-07T07:46:47.6256162Z Setting output ci-td-distributed=False 2025-09-07T07:46:47.6256377Z Setting output is-unstable=False 2025-09-07T07:46:47.6256583Z Setting output reenabled-issues= 2025-09-07T07:46:47.6258768Z Setting output test-matrix={"include": [{"config": "inductor_huggingface_perf_cpu_x86", "shard": 1, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 2, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 3, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 1, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 2, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 3, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 4, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 5, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 1, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 2, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 3, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 4, "num_shards": 4, "runner": "linux.24xl.spr-metal"}]} 2025-09-07T07:46:47.6261064Z Setting output is-test-matrix-empty=False 2025-09-07T07:46:47.6401100Z ##[group]Run echo "Filtered matrix:" 2025-09-07T07:46:47.6401324Z echo "Filtered matrix:" 2025-09-07T07:46:47.6403440Z echo "{"include": [{"config": "inductor_huggingface_perf_cpu_x86", "shard": 1, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 2, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_huggingface_perf_cpu_x86", "shard": 3, "num_shards": 3, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 1, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 2, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 3, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 4, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_timm_perf_cpu_x86", "shard": 5, "num_shards": 5, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 1, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 
2, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 3, "num_shards": 4, "runner": "linux.24xl.spr-metal"}, {"config": "inductor_torchbench_perf_cpu_x86", "shard": 4, "num_shards": 4, "runner": "linux.24xl.spr-metal"}]}" 2025-09-07T07:46:47.6405762Z  2025-09-07T07:46:47.6405912Z echo 2025-09-07T07:46:47.6406101Z echo "Is the current job unstable? False" 2025-09-07T07:46:47.6406340Z  2025-09-07T07:46:47.6406496Z echo 2025-09-07T07:46:47.6406686Z echo "Is keep-going label set? True" 2025-09-07T07:46:47.6406904Z  2025-09-07T07:46:47.6407052Z echo 2025-09-07T07:46:47.6407274Z echo "Reenabled issues? " 2025-09-07T07:46:47.6410751Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:47.6410972Z env: 2025-09-07T07:46:47.6411119Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.6411292Z ##[endgroup] 2025-09-07T07:46:47.6427497Z Filtered matrix: 2025-09-07T07:46:47.6430003Z {include: [{config: inductor_huggingface_perf_cpu_x86, shard: 1, num_shards: 3, runner: linux.24xl.spr-metal}, {config: inductor_huggingface_perf_cpu_x86, shard: 2, num_shards: 3, runner: linux.24xl.spr-metal}, {config: inductor_huggingface_perf_cpu_x86, shard: 3, num_shards: 3, runner: linux.24xl.spr-metal}, {config: inductor_timm_perf_cpu_x86, shard: 1, num_shards: 5, runner: linux.24xl.spr-metal}, {config: inductor_timm_perf_cpu_x86, shard: 2, num_shards: 5, runner: linux.24xl.spr-metal}, {config: inductor_timm_perf_cpu_x86, shard: 3, num_shards: 5, runner: linux.24xl.spr-metal}, {config: inductor_timm_perf_cpu_x86, shard: 4, num_shards: 5, runner: linux.24xl.spr-metal}, {config: inductor_timm_perf_cpu_x86, shard: 5, num_shards: 5, runner: linux.24xl.spr-metal}, {config: inductor_torchbench_perf_cpu_x86, shard: 1, num_shards: 4, runner: linux.24xl.spr-metal}, {config: inductor_torchbench_perf_cpu_x86, shard: 2, num_shards: 4, runner: linux.24xl.spr-metal}, {config: inductor_torchbench_perf_cpu_x86, shard: 3, num_shards: 4, runner: linux.24xl.spr-metal}, {config: inductor_torchbench_perf_cpu_x86, shard: 4, num_shards: 4, runner: linux.24xl.spr-metal}]} 2025-09-07T07:46:47.6432180Z 2025-09-07T07:46:47.6432270Z Is the current job unstable? False 2025-09-07T07:46:47.6432409Z 2025-09-07T07:46:47.6432484Z Is keep-going label set? True 2025-09-07T07:46:47.6432607Z 2025-09-07T07:46:47.6432676Z Reenabled issues? 
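The "Setting output ..." lines above come from the filter step appending name=value pairs to the file that $GITHUB_OUTPUT points at, which is how later steps read test-matrix and is-test-matrix-empty. A minimal sketch of that output mechanism, assuming a single-line JSON matrix; set_output is a hypothetical helper and this is not the actual .github/scripts/filter_test_configs.py logic:

import json
import os

def set_output(name, value):
    # Append a name=value line to the $GITHUB_OUTPUT file (the mechanism
    # behind the "Setting output ..." lines); fall back to stdout locally.
    out_file = os.environ.get("GITHUB_OUTPUT")
    if out_file:
        with open(out_file, "a") as f:
            f.write(f"{name}={value}\n")
    print(f"Setting output {name}={value}")

# One entry of the matrix shown above, shortened for illustration.
test_matrix = {"include": [{"config": "inductor_huggingface_perf_cpu_x86",
                            "shard": 1, "num_shards": 3,
                            "runner": "linux.24xl.spr-metal"}]}
set_output("test-matrix", json.dumps(test_matrix))
set_output("is-test-matrix-empty", str(len(test_matrix["include"]) == 0))
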
2025-09-07T07:46:47.6583264Z ##[group]Run echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-09-07T07:46:47.6583581Z echo "timeout=$((JOB_TIMEOUT-30))" >> "${GITHUB_OUTPUT}" 2025-09-07T07:46:47.6587479Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:47.6587708Z env: 2025-09-07T07:46:47.6587853Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.6588021Z JOB_TIMEOUT: 720 2025-09-07T07:46:47.6588169Z ##[endgroup] 2025-09-07T07:46:47.6733630Z ##[group]Run env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:46:47.6733945Z env | grep '^GITHUB' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:46:47.6734204Z env | grep '^CI' >> "/tmp/github_env_${GITHUB_RUN_ID}" 2025-09-07T07:46:47.6737521Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T07:46:47.6737742Z env: 2025-09-07T07:46:47.6737890Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.6738060Z ##[endgroup] 2025-09-07T07:46:47.6892536Z ##[group]Run set -x 2025-09-07T07:46:47.6892771Z set -x 2025-09-07T07:46:47.6892914Z  2025-09-07T07:46:47.6893082Z if [[ $TEST_CONFIG == 'multigpu' ]]; then 2025-09-07T07:46:47.6893327Z  TEST_COMMAND=.ci/pytorch/multigpu-test.sh 2025-09-07T07:46:47.6893569Z elif [[ $BUILD_ENVIRONMENT == *onnx* ]]; then 2025-09-07T07:46:47.6893778Z  TEST_COMMAND=.ci/onnx/test.sh 2025-09-07T07:46:47.6893963Z else 2025-09-07T07:46:47.6894128Z  TEST_COMMAND=.ci/pytorch/test.sh 2025-09-07T07:46:47.6894315Z fi 2025-09-07T07:46:47.6894440Z  2025-09-07T07:46:47.6894607Z # Leaving 1GB for the runner and other things 2025-09-07T07:46:47.6894941Z TOTAL_AVAILABLE_MEMORY_IN_GB=$(awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo) 2025-09-07T07:46:47.6895444Z # https://docs.docker.com/engine/containers/resource_constraints/#--memory-swap-details, the 3GB swap 2025-09-07T07:46:47.6895842Z # comes from https://github.com/pytorch/test-infra/pull/6058 2025-09-07T07:46:47.6896152Z TOTAL_MEMORY_WITH_SWAP=$(("${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}" + 3)) 2025-09-07T07:46:47.6896392Z  2025-09-07T07:46:47.6896566Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-09-07T07:46:47.6896765Z  SHM_OPTS= 2025-09-07T07:46:47.6896926Z  JENKINS_USER= 2025-09-07T07:46:47.6897142Z  # ensure that docker container cleanly exits in 12 hours 2025-09-07T07:46:47.6897420Z  # if for some reason cleanup action doesn't stop container 2025-09-07T07:46:47.6897649Z  # when job is cancelled 2025-09-07T07:46:47.6897833Z  DOCKER_SHELL_CMD="sleep 12h" 2025-09-07T07:46:47.6898014Z else 2025-09-07T07:46:47.6898181Z  SHM_OPTS="--shm-size=${SHM_SIZE}" 2025-09-07T07:46:47.6898384Z  JENKINS_USER="--user jenkins" 2025-09-07T07:46:47.6898572Z  DOCKER_SHELL_CMD= 2025-09-07T07:46:47.6898739Z fi 2025-09-07T07:46:47.6898871Z  2025-09-07T07:46:47.6899078Z # detached container should get cleaned up by teardown_ec2_linux 2025-09-07T07:46:47.6899375Z # TODO: Stop building test binaries as part of the build phase 2025-09-07T07:46:47.6899713Z # Used for GPU_FLAG, SHM_OPTS, JENKINS_USER and DOCKER_SHELL_CMD since that doesn't play nice 2025-09-07T07:46:47.6900014Z # shellcheck disable=SC2086,SC2090 2025-09-07T07:46:47.6900222Z container_name=$(docker run \ 2025-09-07T07:46:47.6900413Z  ${GPU_FLAG:-} \ 2025-09-07T07:46:47.6900599Z  ${SCCACHE_SERVER_PORT_DOCKER_FLAG:-} \ 2025-09-07T07:46:47.6900808Z  -e BUILD_ENVIRONMENT \ 2025-09-07T07:46:47.6900990Z  -e PR_NUMBER \ 2025-09-07T07:46:47.6901161Z  -e GITHUB_ACTIONS \ 2025-09-07T07:46:47.6901337Z  -e GITHUB_REPOSITORY \ 2025-09-07T07:46:47.6901521Z  -e 
GITHUB_WORKFLOW \ 2025-09-07T07:46:47.6901703Z  -e GITHUB_JOB \ 2025-09-07T07:46:47.6901870Z  -e GITHUB_RUN_ID \ 2025-09-07T07:46:47.6902039Z  -e GITHUB_RUN_NUMBER \ 2025-09-07T07:46:47.6902220Z  -e GITHUB_RUN_ATTEMPT \ 2025-09-07T07:46:47.6902398Z  -e JOB_ID \ 2025-09-07T07:46:47.6902558Z  -e JOB_NAME \ 2025-09-07T07:46:47.6902715Z  -e BASE_SHA \ 2025-09-07T07:46:47.6902876Z  -e BRANCH \ 2025-09-07T07:46:47.6903029Z  -e SHA1 \ 2025-09-07T07:46:47.6903190Z  -e AWS_DEFAULT_REGION \ 2025-09-07T07:46:47.6903363Z  -e IN_WHEEL_TEST \ 2025-09-07T07:46:47.6903534Z  -e SHARD_NUMBER \ 2025-09-07T07:46:47.6903706Z  -e TEST_CONFIG \ 2025-09-07T07:46:47.6903878Z  -e NUM_TEST_SHARDS \ 2025-09-07T07:46:47.6904052Z  -e REENABLED_ISSUES \ 2025-09-07T07:46:47.6904242Z  -e CONTINUE_THROUGH_ERROR \ 2025-09-07T07:46:47.6904528Z  -e VERBOSE_TEST_LOGS \ 2025-09-07T07:46:47.6904786Z  -e TEST_SHOWLOCALS \ 2025-09-07T07:46:47.6904958Z  -e NO_TEST_TIMEOUT \ 2025-09-07T07:46:47.6905134Z  -e NO_TD \ 2025-09-07T07:46:47.6905306Z  -e TD_DISTRIBUTED \ 2025-09-07T07:46:47.6905485Z  -e PR_LABELS \ 2025-09-07T07:46:47.6905687Z  -e MAX_JOBS="$(nproc --ignore=2)" \ 2025-09-07T07:46:47.6905887Z  -e SCCACHE_BUCKET \ 2025-09-07T07:46:47.6906068Z  -e SCCACHE_REGION \ 2025-09-07T07:46:47.6906242Z  -e XLA_CUDA \ 2025-09-07T07:46:47.6906424Z  -e XLA_CLANG_CACHE_S3_BUCKET_NAME \ 2025-09-07T07:46:47.6906631Z  -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK \ 2025-09-07T07:46:47.6906852Z  -e PYTORCH_TEST_RERUN_DISABLED_TESTS \ 2025-09-07T07:46:47.6907070Z  -e SKIP_SCCACHE_INITIALIZATION=1 \ 2025-09-07T07:46:47.6907274Z  -e HUGGING_FACE_HUB_TOKEN \ 2025-09-07T07:46:47.6907472Z  -e VLLM_TEST_HUGGING_FACE_TOKEN \ 2025-09-07T07:46:47.6907683Z  -e SCRIBE_GRAPHQL_ACCESS_TOKEN \ 2025-09-07T07:46:47.6907876Z  -e DASHBOARD_TAG \ 2025-09-07T07:46:47.6908053Z  -e ARTIFACTS_FILE_SUFFIX \ 2025-09-07T07:46:47.6908267Z  --memory="${TOTAL_AVAILABLE_MEMORY_IN_GB%.*}g" \ 2025-09-07T07:46:47.6908513Z  --memory-swap="${TOTAL_MEMORY_WITH_SWAP}g" \ 2025-09-07T07:46:47.6908761Z  --env-file="/tmp/github_env_${GITHUB_RUN_ID}" \ 2025-09-07T07:46:47.6908997Z  --security-opt seccomp=unconfined \ 2025-09-07T07:46:47.6909206Z  --cap-add=SYS_PTRACE \ 2025-09-07T07:46:47.6909378Z  --ipc=host \ 2025-09-07T07:46:47.6909541Z  ${SHM_OPTS} \ 2025-09-07T07:46:47.6909700Z  --tty \ 2025-09-07T07:46:47.6909850Z  --detach \ 2025-09-07T07:46:47.6910013Z  --name="${container_name}" \ 2025-09-07T07:46:47.6910201Z  ${JENKINS_USER} \ 2025-09-07T07:46:47.6910417Z  -v "${GITHUB_WORKSPACE}:/var/lib/jenkins/workspace" \ 2025-09-07T07:46:47.6910653Z  -w /var/lib/jenkins/workspace \ 2025-09-07T07:46:47.6910837Z  "${DOCKER_IMAGE}" \ 2025-09-07T07:46:47.6911009Z  ${DOCKER_SHELL_CMD} 2025-09-07T07:46:47.6911171Z ) 2025-09-07T07:46:47.6911357Z # Propagate download.pytorch.org IP to container 2025-09-07T07:46:47.6911729Z grep download.pytorch.org /etc/hosts | docker exec -i "${container_name}" sudo bash -c "/bin/cat >> /etc/hosts" 2025-09-07T07:46:47.6912125Z echo "DOCKER_CONTAINER_ID=${container_name}" >> "${GITHUB_ENV}" 2025-09-07T07:46:47.6912362Z  2025-09-07T07:46:47.6912539Z if [[ ${BUILD_ENVIRONMENT} == *"s390x"* ]]; then 2025-09-07T07:46:47.6912870Z  docker exec -t "${container_name}" sh -c "python3 -m pip install -r .ci/docker/requirements-ci.txt" 2025-09-07T07:46:47.6913154Z fi 2025-09-07T07:46:47.6913293Z  2025-09-07T07:46:47.6913579Z docker exec -t "${container_name}" sh -c "python3 -m pip install $(echo dist/*.whl)[opt-einsum] && ${TEST_COMMAND}" 2025-09-07T07:46:47.6917216Z shell: 
/usr/bin/bash -e {0} 2025-09-07T07:46:47.6917395Z env: 2025-09-07T07:46:47.6917541Z GIT_DEFAULT_BRANCH: main 2025-09-07T07:46:47.6917760Z BUILD_ENVIRONMENT: linux-jammy-py3.9-gcc11-build 2025-09-07T07:46:47.6917985Z PR_NUMBER: 2025-09-07T07:46:47.6918156Z GITHUB_REPOSITORY: pytorch/pytorch 2025-09-07T07:46:47.6918377Z GITHUB_WORKFLOW: inductor-perf-nightly-x86 2025-09-07T07:46:47.6918587Z GITHUB_JOB: test 2025-09-07T07:46:47.6918752Z GITHUB_RUN_ID: 17525285611 2025-09-07T07:46:47.6918930Z GITHUB_RUN_NUMBER: 525 2025-09-07T07:46:47.6919093Z GITHUB_RUN_ATTEMPT: 1 2025-09-07T07:46:47.6919258Z JOB_ID: 49775585769 2025-09-07T07:46:47.6919592Z JOB_NAME: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:46:47.6919943Z BRANCH: main 2025-09-07T07:46:47.6920190Z SHA1: 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:46:47.6920473Z BASE_SHA: 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:46:47.6920702Z TEST_CONFIG: inductor_huggingface_perf_cpu_x86 2025-09-07T07:46:47.6920903Z SHARD_NUMBER: 1 2025-09-07T07:46:47.6921049Z NUM_TEST_SHARDS: 3 2025-09-07T07:46:47.6921205Z REENABLED_ISSUES: 2025-09-07T07:46:47.6921367Z CONTINUE_THROUGH_ERROR: True 2025-09-07T07:46:47.6921549Z VERBOSE_TEST_LOGS: False 2025-09-07T07:46:47.6921711Z TEST_SHOWLOCALS: False 2025-09-07T07:46:47.6921879Z NO_TEST_TIMEOUT: False 2025-09-07T07:46:47.6922034Z NO_TD: False 2025-09-07T07:46:47.6922176Z TD_DISTRIBUTED: False 2025-09-07T07:46:47.6922366Z SCCACHE_BUCKET: ossci-compiler-cache-circleci-v2 2025-09-07T07:46:47.6922580Z SCCACHE_REGION: us-east-1 2025-09-07T07:46:47.6922745Z SHM_SIZE: 1g 2025-09-07T07:46:47.6923211Z DOCKER_IMAGE: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:46:47.6923697Z XLA_CUDA: 2025-09-07T07:46:47.6923914Z XLA_CLANG_CACHE_S3_BUCKET_NAME: ossci-compiler-clang-cache-circleci-xla 2025-09-07T07:46:47.6924178Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK: 0 2025-09-07T07:46:47.6924397Z PYTORCH_TEST_RERUN_DISABLED_TESTS: 0 2025-09-07T07:46:47.6924824Z DASHBOARD_TAG: training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true 2025-09-07T07:46:47.6925384Z VLLM_TEST_HUGGING_FACE_TOKEN: *** 2025-09-07T07:46:47.6925647Z HUGGING_FACE_HUB_TOKEN: *** 2025-09-07T07:46:47.6925901Z SCRIBE_GRAPHQL_ACCESS_TOKEN: *** 2025-09-07T07:46:47.6926216Z ARTIFACTS_FILE_SUFFIX: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T07:46:47.6926529Z ##[endgroup] 2025-09-07T07:46:47.6943009Z + [[ inductor_huggingface_perf_cpu_x86 == \m\u\l\t\i\g\p\u ]] 2025-09-07T07:46:47.6943278Z + [[ linux-jammy-py3.9-gcc11-build == *onnx* ]] 2025-09-07T07:46:47.6943510Z + TEST_COMMAND=.ci/pytorch/test.sh 2025-09-07T07:46:47.6946542Z ++ awk '/MemTotal/ { printf "%.3f \n", $2/1024/1024 - 1 }' /proc/meminfo 2025-09-07T07:46:47.6961924Z + TOTAL_AVAILABLE_MEMORY_IN_GB='187.488 ' 2025-09-07T07:46:47.6962369Z + TOTAL_MEMORY_WITH_SWAP=190 2025-09-07T07:46:47.6962657Z + [[ linux-jammy-py3.9-gcc11-build == *\s\3\9\0\x* ]] 2025-09-07T07:46:47.6962900Z + SHM_OPTS=--shm-size=1g 2025-09-07T07:46:47.6963084Z + JENKINS_USER='--user jenkins' 2025-09-07T07:46:47.6963263Z + DOCKER_SHELL_CMD= 2025-09-07T07:46:47.6970423Z +++ nproc --ignore=2 2025-09-07T07:46:47.7356014Z ++ docker run -e BUILD_ENVIRONMENT -e PR_NUMBER -e GITHUB_ACTIONS -e GITHUB_REPOSITORY -e GITHUB_WORKFLOW -e GITHUB_JOB -e GITHUB_RUN_ID -e 
GITHUB_RUN_NUMBER -e GITHUB_RUN_ATTEMPT -e JOB_ID -e JOB_NAME -e BASE_SHA -e BRANCH -e SHA1 -e AWS_DEFAULT_REGION -e IN_WHEEL_TEST -e SHARD_NUMBER -e TEST_CONFIG -e NUM_TEST_SHARDS -e REENABLED_ISSUES -e CONTINUE_THROUGH_ERROR -e VERBOSE_TEST_LOGS -e TEST_SHOWLOCALS -e NO_TEST_TIMEOUT -e NO_TD -e TD_DISTRIBUTED -e PR_LABELS -e MAX_JOBS=94 -e SCCACHE_BUCKET -e SCCACHE_REGION -e XLA_CUDA -e XLA_CLANG_CACHE_S3_BUCKET_NAME -e PYTORCH_TEST_CUDA_MEM_LEAK_CHECK -e PYTORCH_TEST_RERUN_DISABLED_TESTS -e SKIP_SCCACHE_INITIALIZATION=1 -e HUGGING_FACE_HUB_TOKEN -e VLLM_TEST_HUGGING_FACE_TOKEN -e SCRIBE_GRAPHQL_ACCESS_TOKEN -e DASHBOARD_TAG -e ARTIFACTS_FILE_SUFFIX --memory=187g --memory-swap=190g --env-file=/tmp/github_env_17525285611 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --ipc=host --shm-size=1g --tty --detach --name= --user jenkins -v /home/ec2-user/actions-runner/_work/pytorch/pytorch:/var/lib/jenkins/workspace -w /var/lib/jenkins/workspace 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T07:49:24.7727351Z + container_name=7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T07:49:24.7730522Z + grep download.pytorch.org /etc/hosts 2025-09-07T07:49:24.7731807Z + docker exec -i 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 sudo bash -c '/bin/cat >> /etc/hosts' 2025-09-07T07:49:24.8809028Z + echo DOCKER_CONTAINER_ID=7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T07:49:24.8810557Z + [[ linux-jammy-py3.9-gcc11-build == *\s\3\9\0\x* ]] 2025-09-07T07:49:24.8813405Z ++ echo dist/torch-2.9.0a0+git93fb23d-cp39-cp39-linux_x86_64.whl 2025-09-07T07:49:24.8815028Z + docker exec -t 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 sh -c 'python3 -m pip install dist/torch-2.9.0a0+git93fb23d-cp39-cp39-linux_x86_64.whl[opt-einsum] && .ci/pytorch/test.sh' 2025-09-07T07:49:25.1822726Z Processing ./dist/torch-2.9.0a0+git93fb23d-cp39-cp39-linux_x86_64.whl (from torch==2.9.0a0+git93fb23d) 2025-09-07T07:49:25.3811599Z Requirement already satisfied: filelock in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (3.19.1) 2025-09-07T07:49:25.3812345Z Requirement already satisfied: typing-extensions>=4.10.0 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (4.15.0) 2025-09-07T07:49:25.3813044Z Requirement already satisfied: sympy>=1.13.3 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (1.13.3) 2025-09-07T07:49:25.3813702Z Requirement already satisfied: networkx>=2.5.1 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (2.8.8) 2025-09-07T07:49:25.3814419Z Requirement already satisfied: jinja2 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (3.1.6) 2025-09-07T07:49:25.3817516Z Requirement already satisfied: fsspec>=0.8.5 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (2025.3.0) 2025-09-07T07:49:25.3827398Z Requirement already satisfied: opt-einsum>=3.3 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (3.3.0) 2025-09-07T07:49:25.4089406Z Requirement already satisfied: 
numpy>=1.7 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from opt-einsum>=3.3->torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (1.22.4) 2025-09-07T07:49:25.4101265Z Requirement already satisfied: mpmath<1.4,>=1.1.0 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from sympy>=1.13.3->torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (1.3.0) 2025-09-07T07:49:25.4132563Z Requirement already satisfied: MarkupSafe>=2.0 in /opt/conda/envs/py_3.9/lib/python3.9/site-packages (from jinja2->torch==2.9.0a0+git93fb23d->torch==2.9.0a0+git93fb23d) (3.0.2) 2025-09-07T07:49:26.0959911Z Installing collected packages: torch 2025-09-07T07:49:32.5575957Z ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 2025-09-07T07:49:32.5576582Z dall-e 0.1 requires torchvision, which is not installed. 2025-09-07T07:49:32.5576854Z effdet 0.4.1 requires torchvision, which is not installed. 2025-09-07T07:49:32.5577176Z pytorch-labs-segment-anything-fast 0.2 requires torchao, which is not installed. 2025-09-07T07:49:32.5577604Z pytorch-labs-segment-anything-fast 0.2 requires torchvision>=0.17.0.dev20231026, which is not installed. 2025-09-07T07:49:32.5578058Z timm 1.0.14 requires torchvision, which is not installed. 2025-09-07T07:49:32.5578367Z Successfully installed torch-2.9.0a0+git93fb23d 2025-09-07T07:49:32.6482459Z + export TERM=vt100 2025-09-07T07:49:32.6482666Z + TERM=vt100 2025-09-07T07:49:32.6482825Z ++ dirname .ci/pytorch/test.sh 2025-09-07T07:49:32.6485272Z + source .ci/pytorch/common.sh 2025-09-07T07:49:32.6487784Z +++ dirname .ci/pytorch/common.sh 2025-09-07T07:49:32.6493207Z ++ source .ci/pytorch/common_utils.sh 2025-09-07T07:49:32.6493877Z +++ declare -f -t trap_add 2025-09-07T07:49:32.6497357Z ++ set -ex -o pipefail 2025-09-07T07:49:32.6497700Z ++ [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-09-07T07:49:32.6497927Z ++ BUILD_TEST_LIBTORCH=0 2025-09-07T07:49:32.6507492Z ++ dirname .ci/pytorch/test.sh 2025-09-07T07:49:32.6512695Z + source .ci/pytorch/common-build.sh 2025-09-07T07:49:32.6513744Z ++ [[ linux-jammy-py3.9-gcc11-build != *win-* ]] 2025-09-07T07:49:32.6521416Z ++++ dirname .ci/pytorch/common-build.sh 2025-09-07T07:49:32.6527049Z +++ cd .ci/pytorch 2025-09-07T07:49:32.6527716Z +++ pwd -P 2025-09-07T07:49:32.6529010Z ++ script_dir=/var/lib/jenkins/workspace/.ci/pytorch 2025-09-07T07:49:32.6529336Z ++ [[ linux-jammy-py3.9-gcc11-build == *-pch* ]] 2025-09-07T07:49:32.6529549Z ++ which sccache 2025-09-07T07:49:32.6544288Z ++ [[ -z ossci-compiler-cache-circleci-v2 ]] 2025-09-07T07:49:32.6544527Z ++ sccache --stop-server 2025-09-07T07:49:32.6567918Z ++ true 2025-09-07T07:49:32.6568092Z ++ rm -f /var/lib/jenkins/sccache_error.log 2025-09-07T07:49:32.6575077Z ++ trap_add sccache_epilogue EXIT 2025-09-07T07:49:32.6575312Z ++ trap_add_cmd=sccache_epilogue 2025-09-07T07:49:32.6575530Z ++ shift 2025-09-07T07:49:32.6575695Z ++ for trap_add_name in "$@" 2025-09-07T07:49:32.6583543Z ++++ trap -p EXIT 2025-09-07T07:49:32.6584853Z +++ eval 'extract_trap_cmd ' 2025-09-07T07:49:32.6585061Z ++++ extract_trap_cmd 2025-09-07T07:49:32.6585235Z ++++ printf '%s\n' '' 2025-09-07T07:49:32.6585492Z +++ printf '%s\n' sccache_epilogue 2025-09-07T07:49:32.6586657Z ++ trap -- ' 2025-09-07T07:49:32.6586820Z sccache_epilogue' EXIT 2025-09-07T07:49:32.6586990Z ++ [[ -n 1 ]] 2025-09-07T07:49:32.6587253Z ++ echo 'Skipping sccache server initialization, setting environment variables' 
2025-09-07T07:49:32.6587635Z Skipping sccache server initialization, setting environment variables 2025-09-07T07:49:32.6587919Z ++ export SCCACHE_IDLE_TIMEOUT=0 2025-09-07T07:49:32.6588118Z ++ SCCACHE_IDLE_TIMEOUT=0 2025-09-07T07:49:32.6588350Z ++ export SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-09-07T07:49:32.6588660Z ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-09-07T07:49:32.6588957Z ++ export RUST_LOG=sccache::server=error 2025-09-07T07:49:32.6589170Z ++ RUST_LOG=sccache::server=error 2025-09-07T07:49:32.6589367Z ++ sccache --zero-stats 2025-09-07T07:49:32.8025195Z Statistics zeroed. 2025-09-07T07:49:32.8030321Z ++ which ccache 2025-09-07T07:49:32.8043640Z + [[ linux-jammy-py3.9-gcc11-build != *rocm* ]] 2025-09-07T07:49:32.8043916Z + [[ linux-jammy-py3.9-gcc11-build != *s390x* ]] 2025-09-07T07:49:32.8044179Z + [[ -d /var/lib/jenkins/workspace ]] 2025-09-07T07:49:32.8046022Z ++ stat -c %u /var/lib/jenkins/workspace 2025-09-07T07:49:32.8054916Z + WORKSPACE_ORIGINAL_OWNER_ID=1000 2025-09-07T07:49:32.8055146Z + trap_add cleanup_workspace EXIT 2025-09-07T07:49:32.8055372Z + trap_add_cmd=cleanup_workspace 2025-09-07T07:49:32.8055579Z + shift 2025-09-07T07:49:32.8055753Z + for trap_add_name in "$@" 2025-09-07T07:49:32.8060885Z +++ trap -p EXIT 2025-09-07T07:49:32.8062761Z ++ eval 'extract_trap_cmd trap -- '\'' 2025-09-07T07:49:32.8063015Z sccache_epilogue'\'' EXIT' 2025-09-07T07:49:32.8063229Z +++ extract_trap_cmd trap -- ' 2025-09-07T07:49:32.8063434Z sccache_epilogue' EXIT 2025-09-07T07:49:32.8063616Z +++ printf '%s\n' ' 2025-09-07T07:49:32.8063796Z sccache_epilogue' 2025-09-07T07:49:32.8063989Z ++ printf '%s\n' cleanup_workspace 2025-09-07T07:49:32.8064916Z + trap -- ' 2025-09-07T07:49:32.8065071Z sccache_epilogue 2025-09-07T07:49:32.8065256Z cleanup_workspace' EXIT 2025-09-07T07:49:32.8065486Z + sudo chown -R jenkins /var/lib/jenkins/workspace 2025-09-07T07:49:33.1953910Z + git config --global --add safe.directory /var/lib/jenkins/workspace 2025-09-07T07:49:33.1959840Z + echo 'Environment variables:' 2025-09-07T07:49:33.1960062Z Environment variables: 2025-09-07T07:49:33.1960259Z + env 2025-09-07T07:49:33.1966575Z GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-09-07T07:49:33.1966886Z CONTINUE_THROUGH_ERROR=True 2025-09-07T07:49:33.1967104Z BUILD_ENVIRONMENT=linux-jammy-py3.9-gcc11-build 2025-09-07T07:49:33.1969051Z VLLM_TEST_HUGGING_FACE_TOKEN=*** 2025-09-07T07:49:33.1969252Z HOSTNAME=7e583f71185a 2025-09-07T07:49:33.1969606Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.1969975Z GITHUB_ACTION=__run_2 2025-09-07T07:49:33.1970159Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-09-07T07:49:33.1970341Z GITHUB_RUN_NUMBER=525 2025-09-07T07:49:33.1970534Z TEST_CONFIG=inductor_huggingface_perf_cpu_x86 2025-09-07T07:49:33.1970760Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-09-07T07:49:33.1970976Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-09-07T07:49:33.1971171Z SCCACHE_IDLE_TIMEOUT=0 2025-09-07T07:49:33.1971441Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-09-07T07:49:33.1971640Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-09-07T07:49:33.1971833Z GITHUB_REF_TYPE=branch 2025-09-07T07:49:33.1972019Z BASE_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.1972230Z XLA_CUDA= 2025-09-07T07:49:33.1972390Z NCCL_LIB_DIR=/usr/local/cuda/lib64/ 2025-09-07T07:49:33.1972657Z HUGGING_FACE_HUB_TOKEN=*** 2025-09-07T07:49:33.1975847Z *** 
2025-09-07T07:49:33.1975997Z GITHUB_REPOSITORY_ID=65600975 2025-09-07T07:49:33.1976185Z GITHUB_ACTIONS=true 2025-09-07T07:49:33.1976386Z SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-09-07T07:49:33.1976626Z SHA1=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.1976853Z GITHUB_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.1977246Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor-perf-test-nightly-x86.yml@refs/heads/main 2025-09-07T07:49:33.1977596Z UCC_HOME=/usr 2025-09-07T07:49:33.1977750Z VERBOSE_TEST_LOGS=False 2025-09-07T07:49:33.1977918Z GITHUB_REF=refs/heads/main 2025-09-07T07:49:33.1978091Z SHARD_NUMBER=1 2025-09-07T07:49:33.1978248Z GITHUB_REF_PROTECTED=true 2025-09-07T07:49:33.1978419Z HOME=/var/lib/jenkins 2025-09-07T07:49:33.1978606Z GITHUB_API_URL=https://api.github.com 2025-09-07T07:49:33.1978818Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-09-07T07:49:33.1979012Z UCX_COMMIT= 2025-09-07T07:49:33.1979158Z USE_SYSTEM_NCCL=1 2025-09-07T07:49:33.1979306Z NUM_TEST_SHARDS=3 2025-09-07T07:49:33.1979454Z UCX_HOME=/usr 2025-09-07T07:49:33.1979790Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.1980335Z JOB_NAME=inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:49:33.1980846Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.1981360Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-09-07T07:49:33.1981656Z GITHUB_EVENT_NAME=schedule 2025-09-07T07:49:33.1982102Z DASHBOARD_TAG=training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true 2025-09-07T07:49:33.1982547Z GITHUB_RUN_ID=17525285611 2025-09-07T07:49:33.1982716Z INSTALLED_OPENBLAS= 2025-09-07T07:49:33.1983081Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.1983473Z GITHUB_ACTOR=pytorchmergebot 2025-09-07T07:49:33.1983657Z PR_NUMBER= 2025-09-07T07:49:33.1983796Z DESIRED_CUDA= 2025-09-07T07:49:33.1983949Z GITHUB_RUN_ATTEMPT=1 2025-09-07T07:49:33.1984124Z ANACONDA_PYTHON_VERSION=3.9 2025-09-07T07:49:33.1984348Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-09-07T07:49:33.1984562Z TERM=vt100 2025-09-07T07:49:33.1984708Z INSTALLED_VISION=yes 2025-09-07T07:49:33.1984866Z BRANCH=main 2025-09-07T07:49:33.1985020Z SCCACHE_REGION=us-east-1 2025-09-07T07:49:33.1985193Z OPENSSL_ROOT_DIR=/opt/openssl 2025-09-07T07:49:33.1985378Z CUDA_PATH=/usr/local/cuda 2025-09-07T07:49:33.1985693Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-09-07T07:49:33.1986033Z GITHUB_SERVER_URL=https://github.com 2025-09-07T07:49:33.1986415Z UCC_COMMIT= 2025-09-07T07:49:33.1986564Z REENABLED_ISSUES= 2025-09-07T07:49:33.1986720Z DOCS=yes 2025-09-07T07:49:33.1986863Z SHLVL=1 2025-09-07T07:49:33.1986992Z MAX_JOBS=94 2025-09-07T07:49:33.1987140Z GITHUB_ACTOR_ID=97764156 2025-09-07T07:49:33.1987369Z GITHUB_WORKFLOW_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.1987613Z GITHUB_REF_NAME=main 2025-09-07T07:49:33.1987861Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-09-07T07:49:33.1988122Z GITHUB_JOB=test 2025-09-07T07:49:33.1988280Z NO_TEST_TIMEOUT=False 
2025-09-07T07:49:33.1988440Z TD_DISTRIBUTED=False 2025-09-07T07:49:33.1988611Z GITHUB_REPOSITORY=pytorch/pytorch 2025-09-07T07:49:33.1988806Z GITHUB_RETENTION_DAYS=90 2025-09-07T07:49:33.1988978Z OPENSSL_DIR=/opt/openssl 2025-09-07T07:49:33.1989143Z GITHUB_ACTION_REPOSITORY= 2025-09-07T07:49:33.1989598Z PATH=/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-09-07T07:49:33.1990045Z GITHUB_BASE_REF= 2025-09-07T07:49:33.1990200Z INSTALLED_ACL= 2025-09-07T07:49:33.1990492Z ARTIFACTS_FILE_SUFFIX=test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T07:49:33.1990798Z CI=true 2025-09-07T07:49:33.1990948Z GITHUB_REPOSITORY_OWNER=pytorch 2025-09-07T07:49:33.1991182Z RUST_LOG=sccache::server=error 2025-09-07T07:49:33.1991358Z JOB_ID=49775585769 2025-09-07T07:49:33.1991503Z GITHUB_HEAD_REF= 2025-09-07T07:49:33.1991657Z GITHUB_ACTION_REF= 2025-09-07T07:49:33.1991850Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-09-07T07:49:33.1992070Z TEST_SHOWLOCALS=False 2025-09-07T07:49:33.1992249Z GITHUB_WORKFLOW=inductor-perf-nightly-x86 2025-09-07T07:49:33.1992465Z DEBIAN_FRONTEND=noninteractive 2025-09-07T07:49:33.1992828Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.1993187Z NO_TD=False 2025-09-07T07:49:33.1993339Z SKIP_SCCACHE_INITIALIZATION=1 2025-09-07T07:49:33.1993545Z NCCL_INCLUDE_DIR=/usr/local/cuda/include/ 2025-09-07T07:49:33.1993748Z _=/usr/bin/env 2025-09-07T07:49:33.1993967Z ++ python -c 'import site; print(site.getsitepackages()[0])' 2025-09-07T07:49:33.2220953Z + TORCH_INSTALL_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch 2025-09-07T07:49:33.2221448Z + TORCH_BIN_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/bin 2025-09-07T07:49:33.2223673Z + TORCH_LIB_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/lib 2025-09-07T07:49:33.2224145Z + TORCH_TEST_DIR=/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/test 2025-09-07T07:49:33.2224437Z + BUILD_DIR=build 2025-09-07T07:49:33.2224615Z + BUILD_RENAMED_DIR=build_renamed 2025-09-07T07:49:33.2224821Z + BUILD_BIN_DIR=build/bin 2025-09-07T07:49:33.2225003Z + SHARD_NUMBER=1 2025-09-07T07:49:33.2225166Z + NUM_TEST_SHARDS=3 2025-09-07T07:49:33.2225342Z + export TORCH_SERIALIZATION_DEBUG=1 2025-09-07T07:49:33.2225570Z + TORCH_SERIALIZATION_DEBUG=1 2025-09-07T07:49:33.2225764Z + export VALGRIND=ON 2025-09-07T07:49:33.2225929Z + VALGRIND=ON 2025-09-07T07:49:33.2226110Z + [[ linux-jammy-py3.9-gcc11-build == *clang9* ]] 2025-09-07T07:49:33.2226358Z + [[ linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-09-07T07:49:33.2226568Z + detect_cuda_arch 2025-09-07T07:49:33.2226767Z + [[ linux-jammy-py3.9-gcc11-build == *cuda* ]] 2025-09-07T07:49:33.2227000Z + [[ linux-jammy-py3.9-gcc11-build == *s390x* ]] 2025-09-07T07:49:33.2227211Z + [[ 0 == \1 ]] 2025-09-07T07:49:33.2227359Z + [[ True == \1 ]] 2025-09-07T07:49:33.2227537Z + [[ linux-jammy-py3.9-gcc11-build != *bazel* ]] 2025-09-07T07:49:33.2227750Z ++ realpath build/custom_test_artifacts 2025-09-07T07:49:33.2229516Z + CUSTOM_TEST_ARTIFACT_BUILD_DIR=/var/lib/jenkins/workspace/build/custom_test_artifacts 2025-09-07T07:49:33.2229823Z + [[ -n '' ]] 2025-09-07T07:49:33.2229998Z + echo 'Environment variables' 2025-09-07T07:49:33.2230186Z Environment variables 2025-09-07T07:49:33.2230341Z + env 2025-09-07T07:49:33.2245195Z 
GITHUB_WORKSPACE=/home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-09-07T07:49:33.2245470Z CONTINUE_THROUGH_ERROR=True 2025-09-07T07:49:33.2245694Z BUILD_ENVIRONMENT=linux-jammy-py3.9-gcc11-build 2025-09-07T07:49:33.2246089Z VLLM_TEST_HUGGING_FACE_TOKEN=*** 2025-09-07T07:49:33.2246276Z HOSTNAME=7e583f71185a 2025-09-07T07:49:33.2246637Z GITHUB_PATH=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/add_path_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.2247009Z GITHUB_ACTION=__run_2 2025-09-07T07:49:33.2247193Z PYTORCH_TEST_CUDA_MEM_LEAK_CHECK=0 2025-09-07T07:49:33.2247378Z GITHUB_RUN_NUMBER=525 2025-09-07T07:49:33.2247592Z TEST_CONFIG=inductor_huggingface_perf_cpu_x86 2025-09-07T07:49:33.2247827Z GITHUB_REPOSITORY_OWNER_ID=21003710 2025-09-07T07:49:33.2248062Z TORCH_NVCC_FLAGS=-Xfatbin -compress-all 2025-09-07T07:49:33.2248277Z SCCACHE_IDLE_TIMEOUT=0 2025-09-07T07:49:33.2248536Z SCRIBE_GRAPHQL_ACCESS_TOKEN=*** 2025-09-07T07:49:33.2248740Z GITHUB_TRIGGERING_ACTOR=pytorchmergebot 2025-09-07T07:49:33.2248946Z GITHUB_REF_TYPE=branch 2025-09-07T07:49:33.2249150Z BASE_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.2249363Z XLA_CUDA= 2025-09-07T07:49:33.2249514Z NCCL_LIB_DIR=/usr/local/cuda/lib64/ 2025-09-07T07:49:33.2249849Z HUGGING_FACE_HUB_TOKEN=*** 2025-09-07T07:49:33.2250067Z *** 2025-09-07T07:49:33.2250222Z GITHUB_REPOSITORY_ID=65600975 2025-09-07T07:49:33.2250410Z GITHUB_ACTIONS=true 2025-09-07T07:49:33.2250603Z SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 2025-09-07T07:49:33.2250860Z SHA1=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.2251107Z GITHUB_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.2251501Z GITHUB_WORKFLOW_REF=pytorch/pytorch/.github/workflows/inductor-perf-test-nightly-x86.yml@refs/heads/main 2025-09-07T07:49:33.2251904Z UCC_HOME=/usr 2025-09-07T07:49:33.2252059Z TORCH_SERIALIZATION_DEBUG=1 2025-09-07T07:49:33.2252244Z VERBOSE_TEST_LOGS=False 2025-09-07T07:49:33.2252423Z GITHUB_REF=refs/heads/main 2025-09-07T07:49:33.2252599Z SHARD_NUMBER=1 2025-09-07T07:49:33.2252752Z GITHUB_REF_PROTECTED=true 2025-09-07T07:49:33.2252925Z HOME=/var/lib/jenkins 2025-09-07T07:49:33.2253116Z GITHUB_API_URL=https://api.github.com 2025-09-07T07:49:33.2253336Z PYTORCH_TEST_RERUN_DISABLED_TESTS=0 2025-09-07T07:49:33.2253520Z UCX_COMMIT= 2025-09-07T07:49:33.2253669Z USE_SYSTEM_NCCL=1 2025-09-07T07:49:33.2253831Z NUM_TEST_SHARDS=3 2025-09-07T07:49:33.2253983Z UCX_HOME=/usr 2025-09-07T07:49:33.2254317Z GITHUB_STATE=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/save_state_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.2254869Z JOB_NAME=inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T07:49:33.2255402Z GITHUB_ENV=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_env_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.2255877Z GITHUB_EVENT_PATH=/home/ec2-user/actions-runner/_work/_temp/_github_workflow/event.json 2025-09-07T07:49:33.2256187Z GITHUB_EVENT_NAME=schedule 2025-09-07T07:49:33.2256637Z DASHBOARD_TAG=training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true 2025-09-07T07:49:33.2257096Z GITHUB_RUN_ID=17525285611 2025-09-07T07:49:33.2257276Z INSTALLED_OPENBLAS= 2025-09-07T07:49:33.2257650Z GITHUB_STEP_SUMMARY=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/step_summary_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.2258042Z 
GITHUB_ACTOR=pytorchmergebot 2025-09-07T07:49:33.2258229Z PR_NUMBER= 2025-09-07T07:49:33.2258380Z DESIRED_CUDA= 2025-09-07T07:49:33.2258535Z GITHUB_RUN_ATTEMPT=1 2025-09-07T07:49:33.2258695Z VALGRIND=ON 2025-09-07T07:49:33.2258854Z ANACONDA_PYTHON_VERSION=3.9 2025-09-07T07:49:33.2259081Z GITHUB_GRAPHQL_URL=https://api.github.com/graphql 2025-09-07T07:49:33.2259304Z TERM=vt100 2025-09-07T07:49:33.2259447Z INSTALLED_VISION=yes 2025-09-07T07:49:33.2259612Z BRANCH=main 2025-09-07T07:49:33.2259815Z SCCACHE_REGION=us-east-1 2025-09-07T07:49:33.2260044Z OPENSSL_ROOT_DIR=/opt/openssl 2025-09-07T07:49:33.2260226Z CUDA_PATH=/usr/local/cuda 2025-09-07T07:49:33.2260545Z GITHUB_ACTION_PATH=/home/ec2-user/actions-runner/_work/pytorch/pytorch/./.github/actions/setup-linux 2025-09-07T07:49:33.2260897Z GITHUB_SERVER_URL=https://github.com 2025-09-07T07:49:33.2261095Z UCC_COMMIT= 2025-09-07T07:49:33.2261236Z REENABLED_ISSUES= 2025-09-07T07:49:33.2261398Z DOCS=yes 2025-09-07T07:49:33.2261537Z SHLVL=1 2025-09-07T07:49:33.2261668Z MAX_JOBS=94 2025-09-07T07:49:33.2261820Z GITHUB_ACTOR_ID=97764156 2025-09-07T07:49:33.2262049Z GITHUB_WORKFLOW_SHA=93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T07:49:33.2262290Z GITHUB_REF_NAME=main 2025-09-07T07:49:33.2262535Z XLA_CLANG_CACHE_S3_BUCKET_NAME=ossci-compiler-clang-cache-circleci-xla 2025-09-07T07:49:33.2262799Z GITHUB_JOB=test 2025-09-07T07:49:33.2262958Z NO_TEST_TIMEOUT=False 2025-09-07T07:49:33.2263122Z TD_DISTRIBUTED=False 2025-09-07T07:49:33.2263284Z GITHUB_REPOSITORY=pytorch/pytorch 2025-09-07T07:49:33.2263486Z GITHUB_RETENTION_DAYS=90 2025-09-07T07:49:33.2263661Z OPENSSL_DIR=/opt/openssl 2025-09-07T07:49:33.2263837Z GITHUB_ACTION_REPOSITORY= 2025-09-07T07:49:33.2264289Z PATH=/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-09-07T07:49:33.2264740Z GITHUB_BASE_REF= 2025-09-07T07:49:33.2264898Z INSTALLED_ACL= 2025-09-07T07:49:33.2265195Z ARTIFACTS_FILE_SUFFIX=test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T07:49:33.2265517Z CI=true 2025-09-07T07:49:33.2265663Z GITHUB_REPOSITORY_OWNER=pytorch 2025-09-07T07:49:33.2265877Z RUST_LOG=sccache::server=error 2025-09-07T07:49:33.2266060Z JOB_ID=49775585769 2025-09-07T07:49:33.2266214Z GITHUB_HEAD_REF= 2025-09-07T07:49:33.2266362Z GITHUB_ACTION_REF= 2025-09-07T07:49:33.2266560Z SCCACHE_BUCKET=ossci-compiler-cache-circleci-v2 2025-09-07T07:49:33.2266788Z TEST_SHOWLOCALS=False 2025-09-07T07:49:33.2266985Z GITHUB_WORKFLOW=inductor-perf-nightly-x86 2025-09-07T07:49:33.2267205Z DEBIAN_FRONTEND=noninteractive 2025-09-07T07:49:33.2267577Z GITHUB_OUTPUT=/home/ec2-user/actions-runner/_work/_temp/_runner_file_commands/set_output_8e537892-2514-4c48-bd72-bef6cf48975e 2025-09-07T07:49:33.2267944Z NO_TD=False 2025-09-07T07:49:33.2268103Z SKIP_SCCACHE_INITIALIZATION=1 2025-09-07T07:49:33.2268299Z NCCL_INCLUDE_DIR=/usr/local/cuda/include/ 2025-09-07T07:49:33.2268500Z _=/usr/bin/env 2025-09-07T07:49:33.2268661Z + echo 'Testing pytorch' 2025-09-07T07:49:33.2268832Z Testing pytorch 2025-09-07T07:49:33.2268995Z + export LANG=C.UTF-8 2025-09-07T07:49:33.2269161Z + LANG=C.UTF-8 2025-09-07T07:49:33.2269314Z + PR_NUMBER= 2025-09-07T07:49:33.2269516Z + [[ inductor_huggingface_perf_cpu_x86 == \d\e\f\a\u\l\t ]] 2025-09-07T07:49:33.2269805Z + [[ inductor_huggingface_perf_cpu_x86 == \d\i\s\t\r\i\b\u\t\e\d ]] 2025-09-07T07:49:33.2270084Z + [[ inductor_huggingface_perf_cpu_x86 == \s\l\o\w ]] 
2025-09-07T07:49:33.2270360Z + [[ linux-jammy-py3.9-gcc11-build == *slow-gradcheck* ]] 2025-09-07T07:49:33.2270624Z + [[ linux-jammy-py3.9-gcc11-build == *cuda* ]] 2025-09-07T07:49:33.2270852Z + [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-09-07T07:49:33.2271087Z + [[ linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-09-07T07:49:33.2271335Z + [[ inductor_huggingface_perf_cpu_x86 == *crossref* ]] 2025-09-07T07:49:33.2271580Z + [[ linux-jammy-py3.9-gcc11-build == *rocm* ]] 2025-09-07T07:49:33.2271804Z + [[ linux-jammy-py3.9-gcc11-build == *xpu* ]] 2025-09-07T07:49:33.2272049Z + [[ linux-jammy-py3.9-gcc11-build != *-bazel-* ]] 2025-09-07T07:49:33.2272275Z + pip_install ninja==1.10.2 2025-09-07T07:49:33.2272517Z + pip_install_pkg='python3 -m pip install --progress-bar off' 2025-09-07T07:49:33.2272801Z + python3 -m pip install --progress-bar off ninja==1.10.2 2025-09-07T07:49:33.5534370Z Collecting ninja==1.10.2 2025-09-07T07:49:33.5765074Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl.metadata (5.0 kB) 2025-09-07T07:49:33.5906865Z Downloading ninja-1.10.2-py2.py3-none-manylinux_2_5_x86_64.manylinux1_x86_64.whl (108 kB) 2025-09-07T07:49:34.2767792Z Installing collected packages: ninja 2025-09-07T07:49:34.2768153Z Attempting uninstall: ninja 2025-09-07T07:49:34.2768402Z Found existing installation: ninja 1.11.1.3 2025-09-07T07:49:34.2783588Z Uninstalling ninja-1.11.1.3: 2025-09-07T07:49:34.2824403Z Successfully uninstalled ninja-1.11.1.3 2025-09-07T07:49:34.3237902Z Successfully installed ninja-1.10.2 2025-09-07T07:49:34.3994792Z + export PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-09-07T07:49:34.3995984Z + PATH=/var/lib/jenkins/.local/bin:/opt/cache/bin:/usr/local/nvidia/bin:/usr/local/cuda/bin:/opt/conda/envs/py_3.9/bin:/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin 2025-09-07T07:49:34.3996553Z + [[ linux-jammy-py3.9-gcc11-build == *aarch64* ]] 2025-09-07T07:49:34.3996807Z + [[ linux-jammy-py3.9-gcc11-build == *asan* ]] 2025-09-07T07:49:34.3997044Z + [[ linux-jammy-py3.9-gcc11-build == *-debug* ]] 2025-09-07T07:49:34.3997280Z + [[ linux-jammy-py3.9-gcc11-build != *-bazel-* ]] 2025-09-07T07:49:34.3997600Z + echo 'We are not in debug mode: linux-jammy-py3.9-gcc11-build. Expect the assertion to pass' 2025-09-07T07:49:34.3997981Z We are not in debug mode: linux-jammy-py3.9-gcc11-build. Expect the assertion to pass 2025-09-07T07:49:34.3998247Z + cd test 2025-09-07T07:49:34.3998474Z + python -c 'import torch; torch._C._crash_if_debug_asserts_fail(424242)' 2025-09-07T07:49:34.6786721Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 
2025-09-07T07:49:34.6788242Z import pynvml # type: ignore[import] 2025-09-07T07:49:35.4123448Z + [[ inductor_huggingface_perf_cpu_x86 == \n\o\g\p\u\_\N\O\_\A\V\X\2 ]] 2025-09-07T07:49:35.4123891Z + [[ inductor_huggingface_perf_cpu_x86 == \n\o\g\p\u\_\A\V\X\5\1\2 ]] 2025-09-07T07:49:35.4124273Z + [[ inductor_huggingface_perf_cpu_x86 == \l\e\g\a\c\y\_\n\v\i\d\i\a\_\d\r\i\v\e\r ]] 2025-09-07T07:49:35.4125388Z + DYNAMO_BENCHMARK_FLAGS=() 2025-09-07T07:49:35.4125871Z + [[ inductor_huggingface_perf_cpu_x86 == *pr_time_benchmarks* ]] 2025-09-07T07:49:35.4126226Z + [[ inductor_huggingface_perf_cpu_x86 == *dynamo_eager* ]] 2025-09-07T07:49:35.4126532Z + [[ inductor_huggingface_perf_cpu_x86 == *aot_eager* ]] 2025-09-07T07:49:35.4126828Z + [[ inductor_huggingface_perf_cpu_x86 == *aot_inductor* ]] 2025-09-07T07:49:35.4127160Z + [[ inductor_huggingface_perf_cpu_x86 == *max_autotune_inductor* ]] 2025-09-07T07:49:35.4127477Z + [[ inductor_huggingface_perf_cpu_x86 == *inductor* ]] 2025-09-07T07:49:35.4127752Z + [[ inductor_huggingface_perf_cpu_x86 != *perf* ]] 2025-09-07T07:49:35.4128036Z + [[ inductor_huggingface_perf_cpu_x86 == *dynamic* ]] 2025-09-07T07:49:35.4128308Z + [[ inductor_huggingface_perf_cpu_x86 == *cpu* ]] 2025-09-07T07:49:35.4128596Z + DYNAMO_BENCHMARK_FLAGS+=(--device cpu) 2025-09-07T07:49:35.4155564Z + [[ linux-jammy-py3.9-gcc11-build == *libtorch* ]] 2025-09-07T07:49:35.4155841Z + [[ linux-jammy-py3.9-gcc11-build == *-bazel-* ]] 2025-09-07T07:49:35.4158361Z + cd test 2025-09-07T07:49:35.4159066Z + python -c 'import torch; print(torch.__config__.show())' 2025-09-07T07:49:35.7067735Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T07:49:35.7069183Z import pynvml # type: ignore[import] 2025-09-07T07:49:36.2817024Z PyTorch built with: 2025-09-07T07:49:36.2817272Z - GCC 11.4 2025-09-07T07:49:36.2817436Z - C++ Version: 201703 2025-09-07T07:49:36.2818221Z - Intel(R) oneAPI Math Kernel Library Version 2024.2-Product Build 20240605 for Intel(R) 64 architecture applications 2025-09-07T07:49:36.2818641Z - Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-09-07T07:49:36.2818907Z - OpenMP 201511 (a.k.a. 
OpenMP 4.5) 2025-09-07T07:49:36.2819123Z - LAPACK is enabled (usually provided by MKL) 2025-09-07T07:49:36.2819313Z - NNPACK is enabled 2025-09-07T07:49:36.2819488Z - CPU capability usage: AVX512 2025-09-07T07:49:36.2821989Z - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, COMMIT_SHA=93fb23d6fae7c4e82c4239a1033e522088742634, CXX_COMPILER=/opt/cache/bin/c++, CXX_FLAGS= -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOCUPTI -DLIBKINETO_NOROCTRACER -DLIBKINETO_NOXPUPTI=ON -DUSE_FBGEMM -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -DC10_NODEPRECATED -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Werror=range-loop-construct -Werror=bool-operation -Wnarrowing -Wno-missing-field-initializers -Wno-unknown-pragmas -Wno-unused-parameter -Wno-strict-overflow -Wno-strict-aliasing -Wno-stringop-overflow -Wsuggest-override -Wno-psabi -Wno-error=old-style-cast -faligned-new -Werror -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, TORCH_VERSION=2.9.0, USE_CUDA=OFF, USE_CUDNN=OFF, USE_CUSPARSELT=OFF, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_GLOO=ON, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=OFF, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF, USE_ROCM_KERNEL_ASSERT=OFF, USE_XCCL=OFF, USE_XPU=OFF, 2025-09-07T07:49:36.2824612Z 2025-09-07T07:49:36.4468704Z + cd test 2025-09-07T07:49:36.4469048Z + python -c 'import torch; print(torch.__config__.parallel_info())' 2025-09-07T07:49:36.7295913Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T07:49:36.7297335Z import pynvml # type: ignore[import] 2025-09-07T07:49:37.3053758Z ATen/Parallel: 2025-09-07T07:49:37.3054041Z at::get_num_threads() : 48 2025-09-07T07:49:37.3054252Z at::get_num_interop_threads() : 48 2025-09-07T07:49:37.3054449Z OpenMP 201511 (a.k.a. 
OpenMP 4.5) 2025-09-07T07:49:37.3054642Z omp_get_max_threads() : 48 2025-09-07T07:49:37.3055016Z Intel(R) oneAPI Math Kernel Library Version 2024.2-Product Build 20240605 for Intel(R) 64 architecture applications 2025-09-07T07:49:37.3055337Z mkl_get_max_threads() : 48 2025-09-07T07:49:37.3055581Z Intel(R) MKL-DNN v3.7.1 (Git Hash 8d263e693366ef8db40acc569cc7d8edf644556d) 2025-09-07T07:49:37.3055846Z std::thread::hardware_concurrency() : 96 2025-09-07T07:49:37.3056046Z Environment variables: 2025-09-07T07:49:37.3056209Z OMP_NUM_THREADS : [not set] 2025-09-07T07:49:37.3056413Z MKL_NUM_THREADS : [not set] 2025-09-07T07:49:37.3056602Z ATen parallel backend: OpenMP 2025-09-07T07:49:37.3056717Z 2025-09-07T07:49:37.4677232Z + [[ inductor_huggingface_perf_cpu_x86 == *numpy_2* ]] 2025-09-07T07:49:37.4677587Z + [[ linux-jammy-py3.9-gcc11-build == *aarch64* ]] 2025-09-07T07:49:37.4677885Z + [[ inductor_huggingface_perf_cpu_x86 == *backward* ]] 2025-09-07T07:49:37.4678186Z + [[ inductor_huggingface_perf_cpu_x86 == *xla* ]] 2025-09-07T07:49:37.4678453Z + [[ inductor_huggingface_perf_cpu_x86 == *vllm* ]] 2025-09-07T07:49:37.4678726Z + [[ inductor_huggingface_perf_cpu_x86 == *executorch* ]] 2025-09-07T07:49:37.4679036Z + [[ inductor_huggingface_perf_cpu_x86 == \j\i\t\_\l\e\g\a\c\y ]] 2025-09-07T07:49:37.4679347Z + [[ linux-jammy-py3.9-gcc11-build == *libtorch* ]] 2025-09-07T07:49:37.4679629Z + [[ inductor_huggingface_perf_cpu_x86 == distributed ]] 2025-09-07T07:49:37.4679939Z + [[ inductor_huggingface_perf_cpu_x86 == *operator_benchmark* ]] 2025-09-07T07:49:37.4680593Z + [[ inductor_huggingface_perf_cpu_x86 == *inductor_distributed* ]] 2025-09-07T07:49:37.4682869Z + [[ inductor_huggingface_perf_cpu_x86 == *inductor-halide* ]] 2025-09-07T07:49:37.4683206Z + [[ inductor_huggingface_perf_cpu_x86 == *inductor-triton-cpu* ]] 2025-09-07T07:49:37.4683565Z + [[ inductor_huggingface_perf_cpu_x86 == *inductor-micro-benchmark* ]] 2025-09-07T07:49:37.4683898Z + [[ inductor_huggingface_perf_cpu_x86 == *huggingface* ]] 2025-09-07T07:49:37.4684145Z + install_torchvision 2025-09-07T07:49:37.4684335Z + local orig_preload 2025-09-07T07:49:37.4684510Z + local commit 2025-09-07T07:49:37.4684682Z ++ get_pinned_commit vision 2025-09-07T07:49:37.4684888Z ++ cat .github/ci_commit_pins/vision.txt 2025-09-07T07:49:37.5169849Z + commit=966da7e46f65d6d49df3e31214470a4fe5cc8e66 2025-09-07T07:49:37.5170138Z + orig_preload= 2025-09-07T07:49:37.5170443Z + '[' -n '' ']' 2025-09-07T07:49:37.5170687Z + [[ linux-jammy-py3.9-gcc11-build == *cuda* ]] 2025-09-07T07:49:37.5171224Z + pip_build_and_install git+https://github.com/pytorch/vision.git@966da7e46f65d6d49df3e31214470a4fe5cc8e66 dist/vision 2025-09-07T07:49:37.5171744Z + local build_target=git+https://github.com/pytorch/vision.git@966da7e46f65d6d49df3e31214470a4fe5cc8e66 2025-09-07T07:49:37.5172080Z + local wheel_dir=dist/vision 2025-09-07T07:49:37.5172265Z + local found_whl=0 2025-09-07T07:49:37.5172433Z + for file in "${wheel_dir}"/*.whl 2025-09-07T07:49:37.5172730Z + [[ -f dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl ]] 2025-09-07T07:49:37.5173005Z + found_whl=1 2025-09-07T07:49:37.5173152Z + break 2025-09-07T07:49:37.5173284Z + '[' 1 == 0 ']' 2025-09-07T07:49:37.5173451Z + for file in "${wheel_dir}"/*.whl 2025-09-07T07:49:37.5173751Z + pip_install_whl dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-09-07T07:49:37.5174130Z + args=('dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl') 2025-09-07T07:49:37.5174390Z + local 
args 2025-09-07T07:49:37.5174643Z + [[ dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl == *\ * ]] 2025-09-07T07:49:37.5174944Z + for path in "${args[@]}" 2025-09-07T07:49:37.5175230Z + echo 'Installing dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl' 2025-09-07T07:49:37.5175614Z Installing dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-09-07T07:49:37.5176070Z + python3 -mpip install --no-index --no-deps dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-09-07T07:49:37.7709950Z Processing ./dist/vision/torchvision-0.22.0a0+966da7e-cp39-cp39-linux_x86_64.whl 2025-09-07T07:49:37.7767924Z Installing collected packages: torchvision 2025-09-07T07:49:38.3583703Z Successfully installed torchvision-0.22.0a0+966da7e 2025-09-07T07:49:38.3904298Z + '[' -n '' ']' 2025-09-07T07:49:38.3904580Z + id=0 2025-09-07T07:49:38.3904784Z + test_dynamo_benchmark huggingface 0 2025-09-07T07:49:38.3905018Z ++ pwd 2025-09-07T07:49:38.3905251Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-09-07T07:49:38.3905513Z + local suite=huggingface 2025-09-07T07:49:38.3905707Z + shift 2025-09-07T07:49:38.3905843Z + local shard_id=0 2025-09-07T07:49:38.3905993Z + shift 2025-09-07T07:49:38.3906183Z + [[ inductor_huggingface_perf_cpu_x86 == *perf_compare* ]] 2025-09-07T07:49:38.3906432Z + [[ inductor_huggingface_perf_cpu_x86 == *perf* ]] 2025-09-07T07:49:38.3906667Z + [[ inductor_huggingface_perf_cpu_x86 == *b200* ]] 2025-09-07T07:49:38.3906914Z + test_single_dynamo_benchmark dashboard huggingface 0 2025-09-07T07:49:38.3907676Z ++ pwd 2025-09-07T07:49:38.3909098Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-09-07T07:49:38.3909383Z + mkdir -p /var/lib/jenkins/workspace/test/test-reports 2025-09-07T07:49:38.3922015Z + local name=dashboard 2025-09-07T07:49:38.3922174Z + shift 2025-09-07T07:49:38.3922310Z + local suite=huggingface 2025-09-07T07:49:38.3922489Z + shift 2025-09-07T07:49:38.3922624Z + local shard_id=0 2025-09-07T07:49:38.3922778Z + shift 2025-09-07T07:49:38.3922913Z + partition_flags=() 2025-09-07T07:49:38.3923295Z + local partition_flags 2025-09-07T07:49:38.3923555Z + [[ -n 3 ]] 2025-09-07T07:49:38.3923701Z + [[ -n 0 ]] 2025-09-07T07:49:38.3923944Z + partition_flags=(--total-partitions "$NUM_TEST_SHARDS" --partition-id "$shard_id") 2025-09-07T07:49:38.3924278Z + [[ inductor_huggingface_perf_cpu_x86 == *perf_compare* ]] 2025-09-07T07:49:38.3924529Z + [[ inductor_huggingface_perf_cpu_x86 == *perf* ]] 2025-09-07T07:49:38.3924853Z + test_perf_for_dashboard huggingface --device cpu --total-partitions 3 --partition-id 0 2025-09-07T07:49:38.3925206Z ++ pwd 2025-09-07T07:49:38.3926700Z + TEST_REPORTS_DIR=/var/lib/jenkins/workspace/test/test-reports 2025-09-07T07:49:38.3926978Z + mkdir -p /var/lib/jenkins/workspace/test/test-reports 2025-09-07T07:49:38.3937871Z + local suite=huggingface 2025-09-07T07:49:38.3938041Z + shift 2025-09-07T07:49:38.3938177Z + local backend=inductor 2025-09-07T07:49:38.3938350Z + modes=() 2025-09-07T07:49:38.3938489Z + local modes 2025-09-07T07:49:38.3938943Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *training-true* ]] 2025-09-07T07:49:38.3939699Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *inference-true* ]] 2025-09-07T07:49:38.3940157Z + modes+=(inference) 2025-09-07T07:49:38.3940331Z + targets=('accuracy' 'performance') 
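A minimal sketch (not the CI's actual pip_install_whl / test_single_dynamo_benchmark helpers) of what the trace above amounts to: install a locally built wheel without touching PyPI, then assemble the same shard flags; WHEEL_DIR here stands in for the wheel_dir in the trace, and NUM_TEST_SHARDS / SHARD_ID mirror the values the job passes in.

# Sketch only: offline wheel install plus shard-flag construction, assuming
# the wheel was already built into WHEEL_DIR by an earlier step.
WHEEL_DIR=dist/vision     # wheel_dir in the trace above
NUM_TEST_SHARDS=3         # total shards for this suite in the trace
SHARD_ID=0                # 0-based partition id, as passed to the benchmark

for whl in "${WHEEL_DIR}"/*.whl; do
  # --no-index keeps pip offline; --no-deps avoids disturbing the CI-built torch
  python3 -mpip install --no-index --no-deps "${whl}"
done

partition_flags=(--total-partitions "${NUM_TEST_SHARDS}" --partition-id "${SHARD_ID}")
echo "shard flags: ${partition_flags[*]}"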
2025-09-07T07:49:38.3940521Z + local targets 2025-09-07T07:49:38.3940675Z + local device=cuda 2025-09-07T07:49:38.3940846Z + [[ inductor_huggingface_perf_cpu_x86 == *cpu* ]] 2025-09-07T07:49:38.3941093Z + [[ inductor_huggingface_perf_cpu_x86 == *cpu_x86_zen* ]] 2025-09-07T07:49:38.3941338Z + [[ inductor_huggingface_perf_cpu_x86 == *cpu_x86* ]] 2025-09-07T07:49:38.3941545Z + device=cpu_x86 2025-09-07T07:49:38.3941703Z + test_inductor_set_cpu_affinity 2025-09-07T07:49:38.3942158Z ++ find /usr/lib -name libjemalloc.so.2 2025-09-07T07:49:38.4125587Z + JEMALLOC_LIB=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2 2025-09-07T07:49:38.4130596Z + export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2: 2025-09-07T07:49:38.4130926Z + LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2: 2025-09-07T07:49:38.4131324Z + export MALLOC_CONF=oversize_threshold:1,background_thread:true,metadata_thp:auto,dirty_decay_ms:-1,muzzy_decay_ms:-1 2025-09-07T07:49:38.4131811Z + MALLOC_CONF=oversize_threshold:1,background_thread:true,metadata_thp:auto,dirty_decay_ms:-1,muzzy_decay_ms:-1 2025-09-07T07:49:38.4132172Z + [[ inductor_huggingface_perf_cpu_x86 != *aarch64* ]] 2025-09-07T07:49:38.4135024Z +++ which python 2025-09-07T07:49:38.4150964Z ++ dirname /opt/conda/envs/py_3.9/bin/python 2025-09-07T07:49:38.4624056Z + IOMP_LIB=/opt/conda/envs/py_3.9/bin/../lib/libiomp5.so 2025-09-07T07:49:38.4624495Z + export LD_PRELOAD=/opt/conda/envs/py_3.9/bin/../lib/libiomp5.so:/usr/lib/x86_64-linux-gnu/libjemalloc.so.2: 2025-09-07T07:49:38.4624948Z + LD_PRELOAD=/opt/conda/envs/py_3.9/bin/../lib/libiomp5.so:/usr/lib/x86_64-linux-gnu/libjemalloc.so.2: 2025-09-07T07:49:38.4625315Z + export KMP_AFFINITY=granularity=fine,compact,1,0 2025-09-07T07:49:38.4625565Z + KMP_AFFINITY=granularity=fine,compact,1,0 2025-09-07T07:49:38.4625780Z + export KMP_BLOCKTIME=1 2025-09-07T07:49:38.4625947Z + KMP_BLOCKTIME=1 2025-09-07T07:49:38.4627714Z ++ nproc 2025-09-07T07:49:38.4645604Z + cpus=96 2025-09-07T07:49:38.4650582Z ++ lscpu 2025-09-07T07:49:38.4651554Z ++ grep 'Thread(s) per core:' 2025-09-07T07:49:38.4652245Z ++ awk '{print $4}' 2025-09-07T07:49:38.5000688Z + thread_per_core=2 2025-09-07T07:49:38.5000903Z + cores=48 2025-09-07T07:49:38.5001206Z + [[ inductor_huggingface_perf_cpu_x86 == *aarch64* ]] 2025-09-07T07:49:38.5001479Z + export OMP_NUM_THREADS=48 2025-09-07T07:49:38.5001693Z + OMP_NUM_THREADS=48 2025-09-07T07:49:38.5001950Z ++ python -c 'import os; print(min(os.sched_getaffinity(0)))' 2025-09-07T07:49:38.5226881Z + start_cpu=0 2025-09-07T07:49:38.5228776Z ++ python -c 'import os; print(max(os.sched_getaffinity(0)))' 2025-09-07T07:49:38.5443850Z + end_cpu=93 2025-09-07T07:49:38.5444371Z + export 'TASKSET=taskset -c 0-93' 2025-09-07T07:49:38.5444706Z + TASKSET='taskset -c 0-93' 2025-09-07T07:49:38.5444908Z + for mode in "${modes[@]}" 2025-09-07T07:49:38.5445091Z + [[ inference == \i\n\f\e\r\e\n\c\e ]] 2025-09-07T07:49:38.5445295Z + [[ cpu_x86 == \c\p\u\_\x\8\6 ]] 2025-09-07T07:49:38.5445466Z + dtype=amp 2025-09-07T07:49:38.5445621Z + for target in "${targets[@]}" 2025-09-07T07:49:38.5445803Z + target_flag=('--accuracy') 2025-09-07T07:49:38.5445971Z + local target_flag 2025-09-07T07:49:38.5446131Z + [[ accuracy == \p\e\r\f\o\r\m\a\n\c\e ]] 2025-09-07T07:49:38.5446326Z + [[ accuracy == \a\c\c\u\r\a\c\y ]] 2025-09-07T07:49:38.5446533Z + target_flag+=(--no-translation-validation) 2025-09-07T07:49:38.5447037Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *freezing-true* ]] 
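The affinity setup traced above boils down to the environment below; this is a rough restatement of those steps, with library paths resolved per host rather than hard-coded, not a copy of the CI's test_inductor_set_cpu_affinity function.

# Rough sketch of the CPU perf environment: jemalloc + Intel OpenMP preloaded,
# one OpenMP thread per physical core, and the run pinned to the allowed CPUs.
JEMALLOC_LIB=$(find /usr/lib -name libjemalloc.so.2 | head -n1)
IOMP_LIB="$(dirname "$(which python)")/../lib/libiomp5.so"
export LD_PRELOAD="${IOMP_LIB}:${JEMALLOC_LIB}:${LD_PRELOAD:-}"
export MALLOC_CONF=oversize_threshold:1,background_thread:true,metadata_thp:auto,dirty_decay_ms:-1,muzzy_decay_ms:-1
export KMP_AFFINITY=granularity=fine,compact,1,0
export KMP_BLOCKTIME=1

# Logical CPUs divided by threads-per-core gives physical cores (96 / 2 = 48 here).
cpus=$(nproc)
threads_per_core=$(lscpu | awk '/Thread\(s\) per core:/ {print $4}')
export OMP_NUM_THREADS=$((cpus / threads_per_core))

# Pin to the CPU range this process is actually allowed to use (0-93 in this job).
start_cpu=$(python -c 'import os; print(min(os.sched_getaffinity(0)))')
end_cpu=$(python -c 'import os; print(max(os.sched_getaffinity(0)))')
export TASKSET="taskset -c ${start_cpu}-${end_cpu}"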
2025-09-07T07:49:38.5447493Z + target_flag+=(--freezing)
2025-09-07T07:49:38.5447937Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *default-true* ]]
2025-09-07T07:49:38.5449076Z + taskset -c 0-93 python benchmarks/dynamo/huggingface.py --accuracy --no-translation-validation --freezing --inference --amp --backend inductor --disable-cudagraphs --device cpu --total-partitions 3 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_accuracy.csv
2025-09-07T07:49:39.2455199Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T07:49:39.2456138Z import pynvml # type: ignore[import]
2025-09-07T07:49:41.5087280Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T07:49:41.5088152Z from pkg_resources import resource_filename
2025-09-07T07:49:41.8973179Z
2025-09-07T07:49:41.9001371Z config.json: 0% 0.00/694 [00:00
2025-09-07T07:58:50.7491555Z
2025-09-07T07:58:50.7491630Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7491831Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7492052Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7492527Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7492986Z layer_outputs = layer_module(
2025-09-07T07:58:50.7493311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7493653Z return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7494046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7494425Z self_attn_outputs = self.attention(
2025-09-07T07:58:50.7494815Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7495196Z self_outputs = self.self(
2025-09-07T07:58:50.7495562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
2025-09-07T07:58:50.7495985Z attn_scores = self._sliding_chunks_query_key_matmul(
2025-09-07T07:58:50.7496441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
2025-09-07T07:58:50.7496977Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply
2025-09-07T07:58:50.7497206Z
2025-09-07T07:58:50.7497302Z cudagraph partition due to non gpu ops.
Found from : 2025-09-07T07:58:50.7497782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7498240Z layer_outputs = layer_module( 2025-09-07T07:58:50.7498551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7498892Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7499275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7499663Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7500048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7500429Z self_outputs = self.self( 2025-09-07T07:58:50.7500807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7501220Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7501686Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7502235Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7502457Z 2025-09-07T07:58:50.7502554Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7503030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7503474Z layer_outputs = layer_module( 2025-09-07T07:58:50.7503800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7504145Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7505605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7506070Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7506459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7506843Z self_outputs = self.self( 2025-09-07T07:58:50.7507207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7507615Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7508078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7508644Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7508867Z 2025-09-07T07:58:50.7508944Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7509142Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7509333Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7509523Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7509732Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7510211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7510664Z layer_outputs = layer_module( 2025-09-07T07:58:50.7510992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7511330Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7511705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7512093Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7512488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7512876Z self_outputs = self.self( 2025-09-07T07:58:50.7513244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 536, in forward 2025-09-07T07:58:50.7513654Z diagonal_mask = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7514119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 834, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7514625Z self._mask_invalid_locations(diagonal_attention_scores, window_overlap) 2025-09-07T07:58:50.7515118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 762, in _mask_invalid_locations 2025-09-07T07:58:50.7515629Z input_tensor[:, :affected_seq_len, :, : affected_seq_len + 1] = torch.full_like( 2025-09-07T07:58:50.7515821Z 2025-09-07T07:58:50.7515894Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7516114Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7516589Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7517034Z layer_outputs = layer_module( 2025-09-07T07:58:50.7517356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7517684Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7518070Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7518455Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7518875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7519300Z self_outputs = self.self( 2025-09-07T07:58:50.7519668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T07:58:50.7520051Z attn_scores += diagonal_mask 2025-09-07T07:58:50.7520165Z 2025-09-07T07:58:50.7520270Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7520747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7521193Z layer_outputs = layer_module( 2025-09-07T07:58:50.7521525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7521864Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7522254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7522642Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7523022Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7523402Z self_outputs = self.self( 2025-09-07T07:58:50.7523765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T07:58:50.7524150Z attn_probs = nn.functional.softmax( 2025-09-07T07:58:50.7524274Z 2025-09-07T07:58:50.7524355Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7524544Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7524764Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7525243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7525694Z layer_outputs = layer_module( 2025-09-07T07:58:50.7526010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7526347Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7526733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7527117Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7527500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7527868Z self_outputs = self.self( 2025-09-07T07:58:50.7528236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7528660Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7529150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7529692Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T07:58:50.7530082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7530408Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7530557Z 2025-09-07T07:58:50.7530653Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7531124Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7531569Z layer_outputs = layer_module( 2025-09-07T07:58:50.7531911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7532278Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7532663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7533051Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7533434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7533804Z self_outputs = self.self( 2025-09-07T07:58:50.7534176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7534601Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7535090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7535598Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T07:58:50.7536061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T07:58:50.7536497Z chunked_hidden_states = nn.functional.pad( 2025-09-07T07:58:50.7536817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7537147Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7537286Z 2025-09-07T07:58:50.7537380Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7537853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7538299Z layer_outputs = layer_module( 2025-09-07T07:58:50.7538625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7538965Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7539342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7539733Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7540119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7540499Z self_outputs = self.self( 2025-09-07T07:58:50.7540871Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7541283Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7541775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7542299Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7542487Z 2025-09-07T07:58:50.7542591Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7543059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7543496Z layer_outputs = layer_module( 2025-09-07T07:58:50.7543818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7544152Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7544537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7544920Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7545323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7545733Z self_outputs = self.self( 2025-09-07T07:58:50.7546105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7546527Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7547015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7547525Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7547718Z 2025-09-07T07:58:50.7547816Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7548289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7548739Z layer_outputs = layer_module( 2025-09-07T07:58:50.7549059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7549391Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7549778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7550162Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7550546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7550921Z self_outputs = self.self( 2025-09-07T07:58:50.7551281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T07:58:50.7551769Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T07:58:50.7551995Z 2025-09-07T07:58:50.7552070Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7552271Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7552457Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7552648Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7552838Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7553027Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7553239Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7553715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7554162Z layer_outputs = layer_module( 2025-09-07T07:58:50.7554485Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7554825Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7555206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7555592Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7555980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7556363Z self_outputs = self.self( 2025-09-07T07:58:50.7556731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T07:58:50.7557122Z query_vectors = self.query(hidden_states) 2025-09-07T07:58:50.7557263Z 2025-09-07T07:58:50.7557337Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7557533Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7557750Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7558247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7558722Z layer_outputs = layer_module( 2025-09-07T07:58:50.7559045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7559392Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7559786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7560170Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7560562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7560952Z self_outputs = self.self( 2025-09-07T07:58:50.7561333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7561753Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7562211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7562759Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7562995Z 2025-09-07T07:58:50.7563073Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7563273Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7563503Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7563977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7564435Z layer_outputs = layer_module( 2025-09-07T07:58:50.7564762Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7565106Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7565492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7565881Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7566271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7566655Z self_outputs = self.self( 2025-09-07T07:58:50.7567028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7567437Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7567903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7568455Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7568680Z 2025-09-07T07:58:50.7568789Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7569269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7569713Z layer_outputs = layer_module( 2025-09-07T07:58:50.7570039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7570381Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7570772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7571163Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7571600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7572016Z self_outputs = self.self( 2025-09-07T07:58:50.7572383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7572794Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7573256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7573785Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7574020Z 2025-09-07T07:58:50.7574115Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7574593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7575064Z layer_outputs = layer_module( 2025-09-07T07:58:50.7575390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7575724Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7576113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7576503Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7576890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7577273Z self_outputs = self.self( 2025-09-07T07:58:50.7577632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7578045Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7578512Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7579050Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7579270Z 2025-09-07T07:58:50.7579353Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7579546Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7579769Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7580246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7580697Z layer_outputs = layer_module( 2025-09-07T07:58:50.7581054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7581398Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7581790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7582184Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7582569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7582946Z self_outputs = self.self( 2025-09-07T07:58:50.7583322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T07:58:50.7583704Z attn_scores += diagonal_mask 2025-09-07T07:58:50.7583817Z 2025-09-07T07:58:50.7583924Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7584402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7584952Z layer_outputs = layer_module( 2025-09-07T07:58:50.7585276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7585615Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7586005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7586396Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7586779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7587168Z self_outputs = self.self( 2025-09-07T07:58:50.7587544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T07:58:50.7587941Z attn_probs = nn.functional.softmax( 2025-09-07T07:58:50.7588068Z 2025-09-07T07:58:50.7588147Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7588671Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7588900Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7589376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7589832Z layer_outputs = layer_module( 2025-09-07T07:58:50.7590155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7590505Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7590901Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7591298Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7591694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7592076Z self_outputs = self.self( 2025-09-07T07:58:50.7592450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7592880Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7593372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7593919Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T07:58:50.7594311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7594648Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7594800Z 2025-09-07T07:58:50.7594901Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7595392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7595845Z layer_outputs = layer_module( 2025-09-07T07:58:50.7596165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7596512Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7596908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7597296Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7597678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7598064Z self_outputs = self.self( 2025-09-07T07:58:50.7598475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7599379Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7599866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7600366Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T07:58:50.7600839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T07:58:50.7601278Z chunked_hidden_states = nn.functional.pad( 2025-09-07T07:58:50.7601597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7601927Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7602067Z 2025-09-07T07:58:50.7602167Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7602642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7603088Z layer_outputs = layer_module( 2025-09-07T07:58:50.7603413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7603746Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7604125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7604507Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7604892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7605273Z self_outputs = self.self( 2025-09-07T07:58:50.7605646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7606062Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7606552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7607071Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7607261Z 2025-09-07T07:58:50.7607367Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7607846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7608288Z layer_outputs = layer_module( 2025-09-07T07:58:50.7608612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7608948Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7609341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7609728Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7610104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7610487Z self_outputs = self.self( 2025-09-07T07:58:50.7610855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7611276Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7611758Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7612297Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7612521Z 2025-09-07T07:58:50.7612620Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7613092Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7613544Z layer_outputs = layer_module( 2025-09-07T07:58:50.7613863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7614193Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7614584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7614968Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7615360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7615735Z self_outputs = self.self( 2025-09-07T07:58:50.7616104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T07:58:50.7616592Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T07:58:50.7616817Z 2025-09-07T07:58:50.7616891Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7617089Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7617278Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7617473Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7617665Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7617855Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7618064Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7618543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7618996Z layer_outputs = layer_module( 2025-09-07T07:58:50.7619316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7619653Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7620031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7620423Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7620805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7621183Z self_outputs = self.self( 2025-09-07T07:58:50.7621551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T07:58:50.7621939Z query_vectors = self.query(hidden_states) 2025-09-07T07:58:50.7622077Z 2025-09-07T07:58:50.7622149Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7622343Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7622561Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7623026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7623474Z layer_outputs = layer_module( 2025-09-07T07:58:50.7623796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7624131Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7624516Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7624892Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7625393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7625806Z self_outputs = self.self( 2025-09-07T07:58:50.7626177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7626595Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7627052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7627595Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7627831Z 2025-09-07T07:58:50.7627904Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7628107Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7628335Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7628812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7629264Z layer_outputs = layer_module( 2025-09-07T07:58:50.7629586Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7629924Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7630306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7630695Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7631084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7631466Z self_outputs = self.self( 2025-09-07T07:58:50.7631833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7632235Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7632698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7633238Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7633462Z 2025-09-07T07:58:50.7633568Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T07:58:50.7634041Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7634483Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7634805Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7635144Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7635534Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7635915Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7636295Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7636676Z     self_outputs = self.self(
2025-09-07T07:58:50.7637043Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
2025-09-07T07:58:50.7637453Z     attn_scores = self._sliding_chunks_query_key_matmul(
2025-09-07T07:58:50.7637909Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
2025-09-07T07:58:50.7638472Z     diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key))  # multiply
2025-09-07T07:58:50.7638727Z 
2025-09-07T07:58:50.7638824Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7639300Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7639753Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7640078Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7640410Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7640802Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7641186Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7641572Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7641961Z     self_outputs = self.self(
2025-09-07T07:58:50.7642323Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
2025-09-07T07:58:50.7642740Z     attn_scores = self._sliding_chunks_query_key_matmul(
2025-09-07T07:58:50.7643201Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
2025-09-07T07:58:50.7643739Z     diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key))  # multiply
2025-09-07T07:58:50.7643959Z 
2025-09-07T07:58:50.7644040Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7644235Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7644468Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7644950Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7645405Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7645719Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7646059Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7646450Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7646840Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7647229Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7647601Z     self_outputs = self.self(
2025-09-07T07:58:50.7647973Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
2025-09-07T07:58:50.7648359Z     attn_scores += diagonal_mask
2025-09-07T07:58:50.7648475Z 
2025-09-07T07:58:50.7648579Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7649055Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7649495Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7649819Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7650156Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7650548Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7650933Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7651311Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7651723Z     self_outputs = self.self(
2025-09-07T07:58:50.7652124Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
2025-09-07T07:58:50.7652512Z     attn_probs = nn.functional.softmax(
2025-09-07T07:58:50.7652639Z 
2025-09-07T07:58:50.7652718Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7652925Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7653149Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7653628Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7654075Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7654392Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7654732Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7655125Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7655515Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7655894Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7656280Z     self_outputs = self.self(
2025-09-07T07:58:50.7656651Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T07:58:50.7657082Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T07:58:50.7657570Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T07:58:50.7658116Z     padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
2025-09-07T07:58:50.7658517Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-09-07T07:58:50.7658850Z     return torch._C._nn.pad(input, pad, mode, value)
2025-09-07T07:58:50.7658999Z 
2025-09-07T07:58:50.7659097Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7659580Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7660022Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7660350Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7660696Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7661090Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7661480Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7661867Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7662258Z     self_outputs = self.self(
2025-09-07T07:58:50.7662634Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T07:58:50.7663060Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T07:58:50.7663549Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T07:58:50.7664054Z     chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
2025-09-07T07:58:50.7664531Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
2025-09-07T07:58:50.7664968Z     chunked_hidden_states = nn.functional.pad(
2025-09-07T07:58:50.7665367Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-09-07T07:58:50.7665689Z     return torch._C._nn.pad(input, pad, mode, value)
2025-09-07T07:58:50.7665830Z 
2025-09-07T07:58:50.7665928Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7666405Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7666854Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7667182Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7667520Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7667898Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7668292Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7668681Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7669065Z     self_outputs = self.self(
2025-09-07T07:58:50.7669428Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T07:58:50.7669852Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T07:58:50.7670338Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T07:58:50.7670856Z     context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-09-07T07:58:50.7671048Z 
2025-09-07T07:58:50.7671155Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7671627Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7672067Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7672389Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7672724Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7673114Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7673496Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7673885Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7674262Z     self_outputs = self.self(
2025-09-07T07:58:50.7674633Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T07:58:50.7675056Z     attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T07:58:50.7675543Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T07:58:50.7676063Z     context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-09-07T07:58:50.7676257Z 
2025-09-07T07:58:50.7676356Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7676832Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7677278Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7677592Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7677930Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7678351Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7678770Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7679157Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7679528Z     self_outputs = self.self(
2025-09-07T07:58:50.7679897Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-09-07T07:58:50.7680383Z     attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-09-07T07:58:50.7680603Z 
2025-09-07T07:58:50.7680687Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7680888Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7681115Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7681310Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7681505Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7681702Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7681922Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:58:50.7682424Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.7682964Z     layer_outputs = layer_module(
2025-09-07T07:58:50.7683303Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.7683655Z     return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.7684049Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.7684452Z     self_attn_outputs = self.attention(
2025-09-07T07:58:50.7684856Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.7685253Z     self_outputs = self.self(
2025-09-07T07:58:50.7685625Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
2025-09-07T07:58:50.7686030Z     query_vectors = self.query(hidden_states)
2025-09-07T07:58:50.7686170Z 
2025-09-07T07:58:50.7686243Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7686448Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7686669Z cudagraph partition due to non gpu ops.
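Every partition message above resolves to the same handful of call sites in Longformer's sliding-window attention (modeling_longformer.py lines 509, 524/796, 541, 579, 613/863/876/878 and 618). The sketch below reproduces those operations in isolation; the tensor shapes and window_overlap value are assumptions chosen only so the snippet runs, not values taken from this job.

```python
# Standalone sketch of the flagged Longformer ops (hypothetical shapes).
import torch
import torch.nn.functional as F

batch_x_heads, chunks, window, head_dim = 12, 8, 512, 64
window_overlap = 256  # assumed, not read from the run

query = torch.randn(batch_x_heads, chunks, window, head_dim)
key = torch.randn(batch_x_heads, chunks, window, head_dim)
value = torch.randn(batch_x_heads, chunks, window, head_dim)

# line 796: sliding-chunks query x key matmul
scores = torch.einsum("bcxd,bcyd->bcxy", query, key)

# line 541: in-place add of the diagonal mask (zeros here, for illustration)
scores += torch.zeros_like(scores)

# line 579: softmax over the chunked attention scores
attn_probs = F.softmax(scores, dim=-1)

# line 863: pad the value tensor before the probs x value matmul
padded_value = F.pad(value, (0, 0, window_overlap, window_overlap), value=-1)

# line 878: sliding-chunks attn_probs x value matmul
context = torch.einsum("bcwd,bcdh->bcwh", attn_probs, value)
print(context.shape)
```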
Found from : 2025-09-07T07:58:50.7687147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7687606Z layer_outputs = layer_module( 2025-09-07T07:58:50.7687932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7688282Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7688678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7689061Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7689453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7689846Z self_outputs = self.self( 2025-09-07T07:58:50.7690222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7690632Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7691105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7691725Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7692019Z 2025-09-07T07:58:50.7692094Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7692296Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7692517Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7693010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7693473Z layer_outputs = layer_module( 2025-09-07T07:58:50.7693805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7694160Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7694551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7694958Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7695362Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7695757Z self_outputs = self.self( 2025-09-07T07:58:50.7696128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7696532Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7696996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7697538Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7697758Z 2025-09-07T07:58:50.7697864Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7698338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7698783Z layer_outputs = layer_module( 2025-09-07T07:58:50.7699102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7699441Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7699827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7700216Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7700594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7700976Z self_outputs = self.self( 2025-09-07T07:58:50.7701344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7701756Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7702211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7702752Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7702980Z 2025-09-07T07:58:50.7703079Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7703556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7704005Z layer_outputs = layer_module( 2025-09-07T07:58:50.7704331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7704660Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7705085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7705502Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7705884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7706254Z self_outputs = self.self( 2025-09-07T07:58:50.7706624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7707032Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7707490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7708021Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7708241Z 2025-09-07T07:58:50.7708315Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7708513Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7708733Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7709206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7709644Z layer_outputs = layer_module( 2025-09-07T07:58:50.7709956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7710290Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7710677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7711064Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7711452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7711833Z self_outputs = self.self( 2025-09-07T07:58:50.7712199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T07:58:50.7712580Z attn_scores += diagonal_mask 2025-09-07T07:58:50.7712691Z 2025-09-07T07:58:50.7712794Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7713262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7713710Z layer_outputs = layer_module( 2025-09-07T07:58:50.7714031Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7714373Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7714761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7715139Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7715526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7715904Z self_outputs = self.self( 2025-09-07T07:58:50.7716273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T07:58:50.7716656Z attn_probs = nn.functional.softmax( 2025-09-07T07:58:50.7716780Z 2025-09-07T07:58:50.7716855Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7717053Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7717277Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7717747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7718272Z layer_outputs = layer_module( 2025-09-07T07:58:50.7718588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7718927Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7719313Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7719694Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7720071Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7720450Z self_outputs = self.self( 2025-09-07T07:58:50.7720817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7721232Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7721716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7722246Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T07:58:50.7722637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7722960Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7723098Z 2025-09-07T07:58:50.7723201Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7723672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7724113Z layer_outputs = layer_module( 2025-09-07T07:58:50.7724435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7724778Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7725165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7725553Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7725927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7726308Z self_outputs = self.self( 2025-09-07T07:58:50.7726672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7727089Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7727575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7728069Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T07:58:50.7728541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T07:58:50.7728974Z chunked_hidden_states = nn.functional.pad( 2025-09-07T07:58:50.7729293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7729615Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7729753Z 2025-09-07T07:58:50.7729850Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7730327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7730778Z layer_outputs = layer_module( 2025-09-07T07:58:50.7731135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7731525Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7731913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7732299Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7732685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7733064Z self_outputs = self.self( 2025-09-07T07:58:50.7733428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7733856Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7734345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7734876Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7735068Z 2025-09-07T07:58:50.7735173Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7735643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7736090Z layer_outputs = layer_module( 2025-09-07T07:58:50.7736410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7736747Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7737138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7737516Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7737903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7738288Z self_outputs = self.self( 2025-09-07T07:58:50.7738656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7739075Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7739551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7740072Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7740265Z 2025-09-07T07:58:50.7740362Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7740838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7741288Z layer_outputs = layer_module( 2025-09-07T07:58:50.7741600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7741937Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7742321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7742702Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7743084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7743452Z self_outputs = self.self( 2025-09-07T07:58:50.7743819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T07:58:50.7744302Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T07:58:50.7744577Z 2025-09-07T07:58:50.7744662Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7744859Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7745047Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7745236Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7745426Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7745619Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7745831Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7746306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7746754Z layer_outputs = layer_module( 2025-09-07T07:58:50.7747077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7747409Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7747800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7748191Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7748578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7748954Z self_outputs = self.self( 2025-09-07T07:58:50.7749311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T07:58:50.7749705Z query_vectors = self.query(hidden_states) 2025-09-07T07:58:50.7749840Z 2025-09-07T07:58:50.7749911Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7750107Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7750315Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7750793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7751248Z layer_outputs = layer_module( 2025-09-07T07:58:50.7751573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7751905Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7752284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7752670Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7753062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7753451Z self_outputs = self.self( 2025-09-07T07:58:50.7753827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7754242Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7754721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7755276Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7755507Z 2025-09-07T07:58:50.7755590Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7755789Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7756008Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7756495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7756956Z layer_outputs = layer_module( 2025-09-07T07:58:50.7757312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7757688Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7758082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7758477Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7758877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7759268Z self_outputs = self.self( 2025-09-07T07:58:50.7759640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7760062Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7760534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7761090Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7761320Z 2025-09-07T07:58:50.7761422Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7761899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7762359Z layer_outputs = layer_module( 2025-09-07T07:58:50.7762689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7763038Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7763434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7763823Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7764220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7764611Z self_outputs = self.self( 2025-09-07T07:58:50.7764991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7765410Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7765867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7766422Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7766659Z 2025-09-07T07:58:50.7766758Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7767240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7767700Z layer_outputs = layer_module( 2025-09-07T07:58:50.7768023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7768367Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7768766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7769164Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7769558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7769937Z self_outputs = self.self( 2025-09-07T07:58:50.7770314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7770734Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7771240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7771824Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7772051Z 2025-09-07T07:58:50.7772126Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7772333Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7772555Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7773028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7773480Z layer_outputs = layer_module( 2025-09-07T07:58:50.7773796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7774137Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7774529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7774918Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7775299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7775677Z self_outputs = self.self( 2025-09-07T07:58:50.7776041Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T07:58:50.7776423Z attn_scores += diagonal_mask 2025-09-07T07:58:50.7776534Z 2025-09-07T07:58:50.7776638Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7777101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7777546Z layer_outputs = layer_module( 2025-09-07T07:58:50.7777876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7778215Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7778605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7778982Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7779366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7779750Z self_outputs = self.self( 2025-09-07T07:58:50.7780118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T07:58:50.7780508Z attn_probs = nn.functional.softmax( 2025-09-07T07:58:50.7780633Z 2025-09-07T07:58:50.7780710Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7780955Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7781187Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7781678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7782128Z layer_outputs = layer_module( 2025-09-07T07:58:50.7782468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7782815Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7783216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7783608Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7783990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7784467Z self_outputs = self.self( 2025-09-07T07:58:50.7784835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7785260Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7785759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7786293Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T07:58:50.7786695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7787031Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7787169Z 2025-09-07T07:58:50.7787271Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7787746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7788188Z layer_outputs = layer_module( 2025-09-07T07:58:50.7788507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7788841Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7789226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7789598Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7789981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7790358Z self_outputs = self.self( 2025-09-07T07:58:50.7790727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7791150Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7791627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7792125Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T07:58:50.7792591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T07:58:50.7793019Z chunked_hidden_states = nn.functional.pad( 2025-09-07T07:58:50.7793330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T07:58:50.7793645Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T07:58:50.7793795Z 2025-09-07T07:58:50.7793892Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7794367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7794820Z layer_outputs = layer_module( 2025-09-07T07:58:50.7795138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7795463Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7795851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7796231Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7796617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7796999Z self_outputs = self.self( 2025-09-07T07:58:50.7797390Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7797848Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7798335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7798860Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7799051Z 2025-09-07T07:58:50.7799154Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7799621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7800076Z layer_outputs = layer_module( 2025-09-07T07:58:50.7800398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7800738Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7801133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7801514Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7801900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7802279Z self_outputs = self.self( 2025-09-07T07:58:50.7802651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7803067Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7803552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7804068Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7804270Z 2025-09-07T07:58:50.7804368Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7804845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7805291Z layer_outputs = layer_module( 2025-09-07T07:58:50.7805609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7805950Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7806340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7806726Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7807102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7807487Z self_outputs = self.self( 2025-09-07T07:58:50.7807864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T07:58:50.7808357Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T07:58:50.7808579Z 2025-09-07T07:58:50.7808661Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7808854Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7809051Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7809244Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7809437Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7809621Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7809837Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7810345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7810820Z layer_outputs = layer_module( 2025-09-07T07:58:50.7837243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7837651Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7838067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7838482Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7838890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7839286Z self_outputs = self.self( 2025-09-07T07:58:50.7839672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T07:58:50.7840066Z query_vectors = self.query(hidden_states) 2025-09-07T07:58:50.7840210Z 2025-09-07T07:58:50.7840310Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7840519Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7840747Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7841241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7841696Z layer_outputs = layer_module( 2025-09-07T07:58:50.7842028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7842380Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7842773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7843159Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7843557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7843949Z self_outputs = self.self( 2025-09-07T07:58:50.7844329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7844749Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7845218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7845770Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7846002Z 2025-09-07T07:58:50.7846081Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7846283Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7846509Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7846999Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7847453Z layer_outputs = layer_module( 2025-09-07T07:58:50.7847773Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7848113Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7848504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7848889Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7849273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7849656Z self_outputs = self.self( 2025-09-07T07:58:50.7850048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T07:58:50.7850608Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T07:58:50.7851073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T07:58:50.7851609Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T07:58:50.7851839Z 2025-09-07T07:58:50.7851940Z cudagraph partition due to non gpu ops. 
Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
    attn_scores = self._sliding_chunks_query_key_matmul(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
    diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply

2025-09-07T07:58:50.7857166Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
    attn_scores = self._sliding_chunks_query_key_matmul(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
    diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply

2025-09-07T07:58:50.7862330Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7862533Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7862747Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
    attn_scores += diagonal_mask

2025-09-07T07:58:50.7866900Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
    attn_probs = nn.functional.softmax(

2025-09-07T07:58:50.7870954Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7871152Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7871364Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
    attn_output = self._sliding_chunks_matmul_attn_probs_value(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
    padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
    return torch._C._nn.pad(input, pad, mode, value)
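The frames above all bottom out in Longformer's sliding-window self-attention, where the partition sites are ordinary tensor ops such as the chunked query-key contraction at modeling_longformer.py:796. The following shape sketch only illustrates that einsum contract; the einsum string is taken from the frame above, while the sizes are invented toy values, not the benchmark's.

import torch

# Toy sizes only; in the real model these come from the Longformer config.
b, c, w, d = 4, 3, 8, 16                  # (batch*heads, chunks, window positions, head_dim)
query = torch.randn(b, c, 2 * w, d)
key = torch.randn(b, c, 2 * w, d)

# Same contraction as modeling_longformer.py:796 ("bcxd,bcyd->bcxy"):
# each query position in a chunk is scored against every key position in that chunk.
scores = torch.einsum("bcxd,bcyd->bcxy", query, key)
print(scores.shape)                       # torch.Size([4, 3, 16, 16])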
2025-09-07T07:58:50.7877344Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
    attn_output = self._sliding_chunks_matmul_attn_probs_value(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
    chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
    chunked_hidden_states = nn.functional.pad(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
    return torch._C._nn.pad(input, pad, mode, value)

2025-09-07T07:58:50.7884250Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
    attn_output = self._sliding_chunks_matmul_attn_probs_value(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
    context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))

2025-09-07T07:58:50.7889052Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
    attn_output = self._sliding_chunks_matmul_attn_probs_value(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
    context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))

2025-09-07T07:58:50.7891550Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
    attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()

2025-09-07T07:58:50.7893480Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893567Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893640Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893722Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893792Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893869Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7893976Z cudagraph partition due to non gpu ops. Found from :
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
    layer_outputs = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
    self_attn_outputs = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
    self_outputs = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
    query_vectors = self.query(hidden_states)

2025-09-07T07:58:50.7895844Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7895926Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.7896022Z cudagraph partition due to non gpu ops.
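Several of the partition sites are padding calls on the value path, such as the nn.functional.pad call at modeling_longformer.py:863 quoted in the frames above. torch.nn.functional.pad reads its size tuple from the last dimension backwards, so (0, 0, window_overlap, window_overlap) leaves the head dimension untouched and pads the sequence dimension on both sides with -1. A toy illustration follows; the shapes and window_overlap value are invented, only the call pattern comes from the log.

import torch
import torch.nn.functional as F

window_overlap = 2                         # hypothetical; the real value comes from the model config
value = torch.randn(1, 6, 4)               # (batch*heads, seq_len, head_dim), toy sizes

# (0, 0) -> no padding on head_dim; (window_overlap, window_overlap) -> pad seq_len
# on both ends, filled with -1, matching the call at modeling_longformer.py:863.
padded_value = F.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
print(value.shape, padded_value.shape)     # torch.Size([1, 6, 4]) torch.Size([1, 10, 4])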
Found from : 2025-09-07T07:58:50.7972274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7972340Z layer_outputs = layer_module( 2025-09-07T07:58:50.7972543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7972625Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7972883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7972960Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7973219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7973292Z self_outputs = self.self( 2025-09-07T07:58:50.7973545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7973650Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7973979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7974116Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7974120Z 2025-09-07T07:58:50.7974218Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7974540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7974607Z layer_outputs = layer_module( 2025-09-07T07:58:50.7974820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7974893Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7975154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7975224Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7975487Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7975550Z self_outputs = self.self( 2025-09-07T07:58:50.7975804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.7975918Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.7976245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.7976393Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.7976396Z 2025-09-07T07:58:50.7976491Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.7976818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7976884Z layer_outputs = layer_module( 2025-09-07T07:58:50.7977085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7977167Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7977422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7977556Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7977818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7977880Z self_outputs = self.self( 2025-09-07T07:58:50.7978147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T07:58:50.7978321Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T07:58:50.7978324Z 2025-09-07T07:58:50.7978409Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978480Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978558Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978626Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978694Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978770Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7978869Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.7979199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.7979262Z layer_outputs = layer_module( 2025-09-07T07:58:50.7979462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.7979543Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.7979796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.7979873Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.7980129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.7980191Z self_outputs = self.self( 2025-09-07T07:58:50.7980459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T07:58:50.7980541Z query_vectors = self.query(hidden_states) 2025-09-07T07:58:50.7980544Z 2025-09-07T07:58:50.7980623Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7980696Z cudagraph partition due to non gpu ops 2025-09-07T07:58:50.7980799Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:58:50.8055459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.8055533Z layer_outputs = layer_module( 2025-09-07T07:58:50.8055732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.8055811Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.8056069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.8056139Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.8056402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.8056472Z self_outputs = self.self( 2025-09-07T07:58:50.8056735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.8056842Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.8057166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.8057305Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.8057308Z 2025-09-07T07:58:50.8057402Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:58:50.8057729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T07:58:50.8057793Z layer_outputs = layer_module( 2025-09-07T07:58:50.8058025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:58:50.8058135Z return super().__call__(*args, **kwargs) 2025-09-07T07:58:50.8058391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T07:58:50.8058464Z self_attn_outputs = self.attention( 2025-09-07T07:58:50.8058719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T07:58:50.8058790Z self_outputs = self.self( 2025-09-07T07:58:50.8059042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T07:58:50.8059152Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T07:58:50.8059476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T07:58:50.8059616Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T07:58:50.8059620Z 2025-09-07T07:58:50.8059719Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T07:58:50.8060042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T07:58:50.8060120Z layer_outputs = layer_module(
2025-09-07T07:58:50.8060322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T07:58:50.8060400Z return super().__call__(*args, **kwargs)
2025-09-07T07:58:50.8060654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T07:58:50.8060723Z self_attn_outputs = self.attention(
2025-09-07T07:58:50.8060990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T07:58:50.8061054Z self_outputs = self.self(
2025-09-07T07:58:50.8061316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-09-07T07:58:50.8061488Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-09-07T07:58:50.8061491Z 
2025-09-07T07:58:50.8061564Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.8061649Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.8061720Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.8061797Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.8061865Z cudagraph partition due to non gpu ops
2025-09-07T07:58:50.8061931Z cudagraph partition due to non gpu ops
2025-09-07T07:59:05.0339493Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:59:05.0340212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1716, in torch_dynamo_resume_in_forward_at_1703
2025-09-07T07:59:05.0340795Z prediction_scores = self.lm_head(sequence_output)
2025-09-07T07:59:05.0341226Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1333, in forward
2025-09-07T07:59:05.0341623Z x = self.dense(features)
2025-09-07T07:59:05.0341738Z 
2025-09-07T07:59:05.0341830Z cudagraph partition due to non gpu ops
2025-09-07T07:59:05.0342035Z cudagraph partition due to non gpu ops
2025-09-07T07:59:05.0342222Z cudagraph partition due to non gpu ops
2025-09-07T07:59:05.0342416Z cudagraph partition due to non gpu ops
2025-09-07T07:59:05.0342643Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T07:59:05.0343515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1723, in torch_dynamo_resume_in_forward_at_1703
2025-09-07T07:59:05.0344186Z masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
2025-09-07T07:59:05.0344416Z 
2025-09-07T07:59:05.4751784Z pass
2025-09-07T07:59:05.4752154Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T07:59:08.1625823Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T07:59:08.1626818Z import pynvml # type: ignore[import]
2025-09-07T07:59:10.3910825Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T07:59:10.3911675Z from pkg_resources import resource_filename
2025-09-07T07:59:10.9302440Z 
2025-09-07T07:59:13.3137466Z loading model: 0it [00:00, ?it/s]
2025-09-07T07:59:13.3140694Z loading model: 0it [00:02, ?it/s]
2025-09-07T07:59:13.3141107Z cpu eval BartForCausalLM
2025-09-07T07:59:13.9052350Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T07:59:14.0676287Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T07:59:14.2198842Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T07:59:22.9182623Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9182927Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9183217Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9183534Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9185083Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9185328Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9185527Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9185724Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9185911Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9186108Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9186304Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9186494Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9186676Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9186873Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9187064Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9187257Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9187450Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9187637Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9187836Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9188042Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9188245Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9188434Z cudagraph partition due to non gpu ops
2025-09-07T07:59:22.9188666Z cudagraph partition due to non gpu ops.
Found from : 2025-09-07T07:59:22.9189030Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9189349Z return mod(**inputs) 2025-09-07T07:59:22.9189715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9190093Z outputs = self.model.decoder( 2025-09-07T07:59:22.9190465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9190830Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9191470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9191922Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9192289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9192677Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9193058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9193431Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9193848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9194295Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9194470Z 2025-09-07T07:59:22.9194579Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9194921Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9195219Z return mod(**inputs) 2025-09-07T07:59:22.9195555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9195913Z outputs = self.model.decoder( 2025-09-07T07:59:22.9196272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9196625Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9196939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9197277Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9197634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9198014Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9198383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9198766Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9199185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9199618Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9199770Z 2025-09-07T07:59:22.9199852Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9200041Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9200237Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9200511Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9200703Z cudagraph partition due to non gpu ops 
2025-09-07T07:59:22.9200887Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9201079Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9201274Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9201464Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9201654Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9201836Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9202029Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9202222Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9202414Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9202596Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9202788Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9203008Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9203346Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9203641Z return mod(**inputs) 2025-09-07T07:59:22.9203982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9204436Z outputs = self.model.decoder( 2025-09-07T07:59:22.9204795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9205143Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9205473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9205806Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9206163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9206540Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9206905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9207280Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9207695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9208146Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9208314Z 2025-09-07T07:59:22.9208418Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9208745Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9209054Z return mod(**inputs) 2025-09-07T07:59:22.9209386Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9209742Z outputs = self.model.decoder( 2025-09-07T07:59:22.9210081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9210432Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9210755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9211093Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9211446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9211811Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9212180Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9212553Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9212957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9213380Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9213529Z 2025-09-07T07:59:22.9213603Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9213799Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9213999Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9214213Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9214406Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9214602Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9214785Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9214975Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9215165Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9215357Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9215539Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9215729Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9215926Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9216118Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9216301Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9216490Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9216742Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9217997Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9218295Z return mod(**inputs) 2025-09-07T07:59:22.9218632Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9218987Z outputs = self.model.decoder( 2025-09-07T07:59:22.9219341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9219694Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9220010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9220347Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9220702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9221082Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9221454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9221817Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9222222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9222660Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9222826Z 2025-09-07T07:59:22.9222931Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9223264Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9223559Z return mod(**inputs) 2025-09-07T07:59:22.9223893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9224250Z outputs = self.model.decoder( 2025-09-07T07:59:22.9224603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9224948Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9225269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9225604Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9225956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9226333Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9226699Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9227070Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9227480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9227909Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9228057Z 2025-09-07T07:59:22.9228138Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9228331Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9228528Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9228725Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9228918Z cudagraph partition due to non gpu ops 
2025-09-07T07:59:22.9229101Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9229293Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9229485Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9229674Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9229857Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9230049Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9230238Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9230520Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9230707Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9230898Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9231092Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9231310Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9231639Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9231943Z return mod(**inputs) 2025-09-07T07:59:22.9232278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9232632Z outputs = self.model.decoder( 2025-09-07T07:59:22.9232974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9233326Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9233652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9233992Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9234350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9234713Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9235080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9235450Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9235858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9236303Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9236467Z 2025-09-07T07:59:22.9236561Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9236902Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9237202Z return mod(**inputs) 2025-09-07T07:59:22.9237534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9237892Z outputs = self.model.decoder( 2025-09-07T07:59:22.9238232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9238579Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9238899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9239236Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9239583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9239961Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9240339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9240713Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9241120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9241539Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9241696Z 2025-09-07T07:59:22.9241773Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9241972Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9242169Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9242353Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9242543Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9242735Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9242955Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9243167Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9243356Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9243549Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9243744Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9243926Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9244118Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9244311Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9244501Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9244693Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9244903Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9245237Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9245538Z return mod(**inputs) 2025-09-07T07:59:22.9245879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9246231Z outputs = self.model.decoder( 2025-09-07T07:59:22.9246581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9246935Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9247256Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9247587Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9247933Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9248305Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9248675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9249047Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9249450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9249892Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9250065Z 2025-09-07T07:59:22.9250163Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9250500Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9250802Z return mod(**inputs) 2025-09-07T07:59:22.9251129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9251484Z outputs = self.model.decoder( 2025-09-07T07:59:22.9251830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9252181Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9252505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9252834Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9253189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9253561Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9253931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9254295Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9254705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9255124Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9255271Z 2025-09-07T07:59:22.9255352Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9255610Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9255830Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9256022Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9256216Z cudagraph partition due to non gpu ops 
2025-09-07T07:59:22.9256405Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9256586Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9256778Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9256970Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9257160Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9257343Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9257531Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9257719Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9257908Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9258092Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9258281Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9258501Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9258839Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9259135Z return mod(**inputs) 2025-09-07T07:59:22.9259470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9259830Z outputs = self.model.decoder( 2025-09-07T07:59:22.9260181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9260532Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9260846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9261185Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9261543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9261928Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9262305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9262672Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9263077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9263521Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9263689Z 2025-09-07T07:59:22.9263797Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9264133Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9264427Z return mod(**inputs) 2025-09-07T07:59:22.9264761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9265126Z outputs = self.model.decoder( 2025-09-07T07:59:22.9265474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9265818Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9266141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9266475Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9266828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9267203Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9267565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9267937Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9268380Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9268832Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9268980Z 2025-09-07T07:59:22.9269060Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9269250Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9269461Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9269656Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9269850Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270033Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270223Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270414Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270606Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270788Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9270979Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9271171Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9271367Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9271546Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9271736Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9271931Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9272147Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9272475Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9272777Z return mod(**inputs) 2025-09-07T07:59:22.9273112Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9273464Z outputs = self.model.decoder( 2025-09-07T07:59:22.9273804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9274155Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9274478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9274813Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9275162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9275527Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9275895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9276265Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9276671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9277111Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9277276Z 2025-09-07T07:59:22.9277377Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9277710Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9278012Z return mod(**inputs) 2025-09-07T07:59:22.9278343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9278696Z outputs = self.model.decoder( 2025-09-07T07:59:22.9279034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9279387Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9279710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9280047Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9280392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9280834Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9281342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9281735Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9282161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9282593Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9282793Z 2025-09-07T07:59:22.9282868Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9283078Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9283286Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9283478Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9283681Z cudagraph partition due to non gpu ops 
2025-09-07T07:59:22.9283889Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9284089Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9284285Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9284481Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9284679Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9284877Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9285073Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9285262Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9285460Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9285658Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9285854Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9286069Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9286414Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9286726Z return mod(**inputs) 2025-09-07T07:59:22.9287079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9287441Z outputs = self.model.decoder( 2025-09-07T07:59:22.9287805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9288165Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9288497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9288844Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9289202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9289592Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9289976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9290360Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9290782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9291236Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9291414Z 2025-09-07T07:59:22.9291512Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9291857Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9292165Z return mod(**inputs) 2025-09-07T07:59:22.9292503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9292872Z outputs = self.model.decoder( 2025-09-07T07:59:22.9293232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9293595Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9293997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9294383Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9294754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9295153Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9295548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9295929Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9296366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9296814Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9296982Z 2025-09-07T07:59:22.9297060Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9297278Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9297483Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9297691Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9297892Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9298094Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9298296Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9298494Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9298691Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9298888Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9299077Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9299274Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9299473Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9299670Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9299860Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9300059Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9300288Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9300634Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9300943Z return mod(**inputs) 2025-09-07T07:59:22.9301277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9301645Z outputs = self.model.decoder( 2025-09-07T07:59:22.9302002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9302359Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9302679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9303021Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9303386Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9303775Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9304154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9304525Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9304941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9305387Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9305558Z 2025-09-07T07:59:22.9305665Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9306001Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9306302Z return mod(**inputs) 2025-09-07T07:59:22.9306670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9307060Z outputs = self.model.decoder( 2025-09-07T07:59:22.9307406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9307752Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9308081Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9308417Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9308774Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9309154Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9309518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9309892Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9310305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9310735Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9310883Z 2025-09-07T07:59:22.9310968Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9311160Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9311359Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9311552Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9311745Z cudagraph partition due to non gpu ops 
2025-09-07T07:59:22.9311930Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9312120Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9312313Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9312545Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9312740Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9312932Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9313118Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9313316Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9313508Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9313699Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9313887Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9314114Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T07:59:22.9314459Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9314766Z return mod(**inputs) 2025-09-07T07:59:22.9315100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9315466Z outputs = self.model.decoder( 2025-09-07T07:59:22.9315817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9316174Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9316503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9316835Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9317196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9317574Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9317943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9318317Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9318720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T07:59:22.9319164Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T07:59:22.9319343Z 2025-09-07T07:59:22.9319473Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T07:59:22.9319833Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T07:59:22.9320133Z return mod(**inputs) 2025-09-07T07:59:22.9320456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1901, in forward 2025-09-07T07:59:22.9320810Z outputs = self.model.decoder( 2025-09-07T07:59:22.9321158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T07:59:22.9321508Z layer_outputs = decoder_layer( 2025-09-07T07:59:22.9321824Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T07:59:22.9322162Z return super().__call__(*args, **kwargs) 2025-09-07T07:59:22.9322517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T07:59:22.9322902Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T07:59:22.9323271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T07:59:22.9323631Z attn_output, attn_weights = attention_interface( 2025-09-07T07:59:22.9324033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T07:59:22.9324454Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T07:59:22.9324601Z 2025-09-07T07:59:22.9324685Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9324882Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9325078Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9325273Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9325464Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9325654Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9325841Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326030Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326219Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326412Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326596Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326785Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9326979Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9327166Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9327347Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9327537Z cudagraph partition due to non gpu ops 2025-09-07T07:59:22.9327757Z cudagraph partition due to non gpu ops. 
Found from : (same decoder call chain as the trace above, ending at)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T07:59:22.9333077Z cudagraph partition due to non gpu ops.
2025-09-07T07:59:22.9333404Z Found from : (same decoder call chain, ending at)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-09-07T07:59:22.9338290Z cudagraph partition due to non gpu ops  (message repeated 17 times, through 07:59:22.9341464Z)
2025-09-07T07:59:22.9341803Z Found from : (same decoder call chain, ending at)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T07:59:22.9346931Z cudagraph partition due to non gpu ops.
Found from : (same decoder call chain, ending at)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-09-07T07:59:22.9352307Z cudagraph partition due to non gpu ops  (message repeated 9 times, through 07:59:22.9353925Z)
2025-09-07T07:59:22.9354273Z Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1923, in forward
    loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-09-07T07:59:33.1572624Z pass
2025-09-07T07:59:33.1573118Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T07:59:35.5323141Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T07:59:35.5324171Z   import pynvml  # type: ignore[import]
2025-09-07T07:59:37.7648158Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T07:59:37.7649255Z   from pkg_resources import resource_filename
2025-09-07T07:59:42.8543079Z loading model: 0it [00:00, ?it/s]
2025-09-07T07:59:42.8543437Z loading model: 0it [00:04, ?it/s]
2025-09-07T07:59:42.8543782Z cpu eval BartForConditionalGeneration
2025-09-07T07:59:44.0953531Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]  (warning repeated 3 times, through 07:59:44.7340475Z)
2025-09-07T08:00:01.4262173Z cudagraph partition due to non gpu ops  (message repeated 23 times, through 08:00:01.4266725Z)
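The empty_gpu_cache warnings above come from the benchmark harness invoking its GPU-cache helper on a CPU run, where there is nothing to free. A minimal sketch of such a device-guarded helper follows; the function name and structure are assumptions for illustration, not the harness's actual code:

    import torch

    def empty_gpu_cache(device: str) -> None:
        # Free cached allocator memory only for devices that support it;
        # for "cpu" there is nothing to free, matching the warning above.
        if device == "cuda" and torch.cuda.is_available():
            torch.cuda.empty_cache()
        elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
            torch.xpu.empty_cache()
        else:
            print(f"Trying to call the empty_gpu_cache for device: {device}, "
                  f"which is not in list [cuda, xpu]")

    empty_gpu_cache("cpu")  # prints the warning instead of freeing anything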
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward
    layer_outputs = encoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T08:00:01.4273576Z cudagraph partition due to non gpu ops.
2025-09-07T08:00:01.4273927Z Found from : (same encoder call chain, ending at)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()
2025-09-07T08:00:01.4279729Z cudagraph partition due to non gpu ops  (message repeated 17 times, through 08:00:01.4282980Z)
2025-09-07T08:00:01.4283332Z – 2025-09-07T08:00:01.4446151Z: the same pair of encoder traces repeats verbatim (only timestamps differ): the sdpa_attention.py line 81 trace appears 11 more times, each followed by one "cudagraph partition due to non gpu ops" message, and the line 91 trace appears 10 more times, each followed by 17 such messages.
Found from : 2025-09-07T08:00:01.4446494Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4446804Z return mod(**inputs) 2025-09-07T08:00:01.4447141Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4447491Z outputs = self.model( 2025-09-07T08:00:01.4447835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1270, in forward 2025-09-07T08:00:01.4448196Z encoder_outputs = self.encoder( 2025-09-07T08:00:01.4448552Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 869, in forward 2025-09-07T08:00:01.4448906Z layer_outputs = encoder_layer( 2025-09-07T08:00:01.4449228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4449573Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4449937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 312, in forward 2025-09-07T08:00:01.4450314Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:00:01.4450683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4451067Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4451488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4451924Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4452073Z 2025-09-07T08:00:01.4452159Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4452352Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4452551Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4452753Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4452946Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4453130Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4453327Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4453519Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4453715Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4453900Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4454098Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4454291Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4454495Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4454681Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4454877Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4455070Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4455295Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4455629Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4455938Z return mod(**inputs) 2025-09-07T08:00:01.4456284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4456642Z outputs = self.model( 2025-09-07T08:00:01.4456985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4457342Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4457728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4458133Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4458464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4458798Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4459159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4459544Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4459936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4460322Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4460738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4461193Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4461376Z 2025-09-07T08:00:01.4461475Z cudagraph partition due to non gpu ops. 
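
Every trace also passes through the attention_interface call at modeling_bart.py line 254. In recent transformers this is a dispatch table keyed by the configured attention implementation; the sketch below is a hedged reconstruction of that dispatch (ALL_ATTENTION_FUNCTIONS and config._attn_implementation are my assumptions about the current transformers API; only the attention_interface(...) call itself appears in the log):

from transformers.modeling_utils import ALL_ATTENTION_FUNCTIONS

def run_attention(module, query, key, value, attention_mask=None, **kwargs):
    # assumed dispatch: with _attn_implementation == "sdpa" this resolves to
    # transformers.integrations.sdpa_attention.sdpa_attention_forward,
    # the frame at the bottom of every trace above
    attention_interface = ALL_ATTENTION_FUNCTIONS[module.config._attn_implementation]
    attn_output, attn_weights = attention_interface(
        module, query, key, value, attention_mask=attention_mask, **kwargs
    )
    return attn_output, attn_weights
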
Found from : 2025-09-07T08:00:01.4461816Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4462124Z return mod(**inputs) 2025-09-07T08:00:01.4462456Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4462813Z outputs = self.model( 2025-09-07T08:00:01.4463147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4463509Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4463862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4464215Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4464553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4464899Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4465265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4465643Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4466025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4466410Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4466825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4467256Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4467404Z 2025-09-07T08:00:01.4467484Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4467689Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4467888Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4468090Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4468282Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4468480Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4468677Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4468874Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4469062Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4469257Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4469452Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4469647Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4469860Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4470210Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4470521Z return mod(**inputs) 2025-09-07T08:00:01.4470937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4471302Z outputs = self.model( 2025-09-07T08:00:01.4471640Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4472003Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4472366Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4472729Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4473055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4473404Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4473772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4474172Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4474566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4474944Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4475364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4475817Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4475988Z 2025-09-07T08:00:01.4476095Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4476440Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4476745Z return mod(**inputs) 2025-09-07T08:00:01.4477089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4477456Z outputs = self.model( 2025-09-07T08:00:01.4477796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4478160Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4478509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4478872Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4479202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4479551Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4479910Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4480307Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4480706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4481134Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4481566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4482007Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4482172Z 2025-09-07T08:00:01.4482248Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4482455Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4482656Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4482853Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4483053Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4483254Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4483456Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4483647Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4483908Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4484151Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4484344Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4484529Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4484727Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4484923Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4485127Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4485319Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4485531Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4485866Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4486168Z return mod(**inputs) 2025-09-07T08:00:01.4486505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4486848Z outputs = self.model( 2025-09-07T08:00:01.4487184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4487536Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4487879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4488228Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4488544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4488878Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4489230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4489604Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4489966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4490340Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4490743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4491185Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4491350Z 2025-09-07T08:00:01.4491454Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4491780Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4492084Z return mod(**inputs) 2025-09-07T08:00:01.4492415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4492763Z outputs = self.model( 2025-09-07T08:00:01.4493094Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4493442Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4493792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4494148Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4494473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4494799Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4495152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4495524Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4495897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4496272Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4496699Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4497154Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4497313Z 2025-09-07T08:00:01.4497389Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4497601Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4497998Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4498181Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4498372Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4498562Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4498753Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4498939Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4499134Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4499327Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4499523Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4499705Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4499931Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4500267Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4500571Z return mod(**inputs) 2025-09-07T08:00:01.4500899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4501250Z outputs = self.model( 2025-09-07T08:00:01.4501585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4501938Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4502282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4502627Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4502956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4503294Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4503652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4504031Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4504415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4504791Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4505205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4505645Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4505812Z 2025-09-07T08:00:01.4505910Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4506246Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4506550Z return mod(**inputs) 2025-09-07T08:00:01.4506887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4507237Z outputs = self.model( 2025-09-07T08:00:01.4507561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4507916Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4508261Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4508616Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4508936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4509273Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4509662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4510093Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4510477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4510840Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4511247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4511670Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4511816Z 2025-09-07T08:00:01.4511901Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4512100Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4512289Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4512484Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4512682Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4512878Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4513062Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4513255Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4513448Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4513637Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4513821Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514014Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514207Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514400Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514586Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514779Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4514998Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4515330Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4515626Z return mod(**inputs) 2025-09-07T08:00:01.4515967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4516315Z outputs = self.model( 2025-09-07T08:00:01.4516648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4516999Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4517339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4517691Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4518016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4518350Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4518698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4519078Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4519450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4519825Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4520230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4520663Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4520840Z 2025-09-07T08:00:01.4520935Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4521267Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4521568Z return mod(**inputs) 2025-09-07T08:00:01.4521900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4522312Z outputs = self.model( 2025-09-07T08:00:01.4522651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4523012Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4523363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4523715Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4524048Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4524391Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4524759Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4525140Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4525515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4525901Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4526316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4526752Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4526905Z 2025-09-07T08:00:01.4526990Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4527191Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4527393Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4527594Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4527794Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4527982Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4528183Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4528383Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4528585Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4528782Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4528979Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4529179Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4529404Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4529739Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4530053Z return mod(**inputs) 2025-09-07T08:00:01.4530396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4530758Z outputs = self.model( 2025-09-07T08:00:01.4531101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4531456Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4531816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4532179Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4532508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4532841Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4533203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4533589Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4533981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4534359Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4534765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4535243Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4535446Z 2025-09-07T08:00:01.4535542Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4535872Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4536174Z return mod(**inputs) 2025-09-07T08:00:01.4536501Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4536852Z outputs = self.model( 2025-09-07T08:00:01.4537185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4537537Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4537876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4538230Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4538562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4538903Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4539257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4539632Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4540016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4540391Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4540797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4541215Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4541361Z 2025-09-07T08:00:01.4541435Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4541638Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4541829Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542021Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542204Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542396Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542590Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542782Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4542965Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4543157Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4543348Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4543540Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4543722Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4543911Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4544100Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4544287Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4544506Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4544842Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4545145Z return mod(**inputs) 2025-09-07T08:00:01.4545481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4545831Z outputs = self.model( 2025-09-07T08:00:01.4546155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4546507Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4546852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4547207Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4547555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4547916Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4548271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4548648Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4549021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4549386Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4549796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4550238Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4550405Z 2025-09-07T08:00:01.4550508Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4550842Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4551141Z return mod(**inputs) 2025-09-07T08:00:01.4551473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4551823Z outputs = self.model( 2025-09-07T08:00:01.4552154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4552502Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4552855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4553212Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4553554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4553894Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4554240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4554618Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4554988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4555360Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4555770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4556186Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4556340Z 2025-09-07T08:00:01.4556415Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4556612Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4556806Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4556990Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4557186Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4557382Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4557572Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4557755Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4557947Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4558140Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4558330Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4558511Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4558729Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4559062Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4559365Z return mod(**inputs) 2025-09-07T08:00:01.4559702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4560042Z outputs = self.model( 2025-09-07T08:00:01.4560416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4560800Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4561150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4561497Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4561823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4562155Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4562509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4562892Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4563269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4563641Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4564052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4564496Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4564661Z 2025-09-07T08:00:01.4564767Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4565092Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4565397Z return mod(**inputs) 2025-09-07T08:00:01.4565728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4566079Z outputs = self.model( 2025-09-07T08:00:01.4566402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4566761Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4567112Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4567465Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4567796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4568124Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4568481Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4568862Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4569244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4569613Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4570015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4570440Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4570594Z 2025-09-07T08:00:01.4570667Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4570871Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4571059Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4571251Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4571443Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4571640Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4571821Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4572011Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4572204Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4572396Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4572590Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4572775Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4573024Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4573247Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4573436Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4573621Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4573838Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4574169Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4574475Z return mod(**inputs) 2025-09-07T08:00:01.4574801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4575147Z outputs = self.model( 2025-09-07T08:00:01.4575479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4575831Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4576171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4576526Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4576850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4577186Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4577538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4577906Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4578280Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4578654Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4579065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4579507Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4579677Z 2025-09-07T08:00:01.4579775Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4580108Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4580181Z return mod(**inputs) 2025-09-07T08:00:01.4580413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4580477Z outputs = self.model( 2025-09-07T08:00:01.4580717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4580790Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4581063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4581134Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4581349Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4581436Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4581670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4581775Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4582007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4606517Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4606919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4607050Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4607055Z 2025-09-07T08:00:01.4607144Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607458Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607544Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607616Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607698Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607772Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607844Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607923Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4607992Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4608062Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4608142Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4608214Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4608328Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4608533Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4608601Z return mod(**inputs) 2025-09-07T08:00:01.4608876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4608947Z outputs = self.model( 2025-09-07T08:00:01.4609199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4609274Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4609520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4609593Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4609804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4609895Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4610132Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4610255Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4610492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4610587Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4610872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4611002Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4611006Z 2025-09-07T08:00:01.4611115Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4611307Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4611380Z return mod(**inputs) 2025-09-07T08:00:01.4611618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4611692Z outputs = self.model( 2025-09-07T08:00:01.4611935Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4612008Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4612247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4612316Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4612529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4612611Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4612850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward 2025-09-07T08:00:01.4612953Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:00:01.4613213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4613339Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4613605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4613717Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4613721Z 2025-09-07T08:00:01.4613795Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4613873Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4613943Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614013Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614090Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614159Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614228Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614306Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614376Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614455Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614524Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614593Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614672Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614742Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614818Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614886Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4614983Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4615178Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4615239Z return mod(**inputs) 2025-09-07T08:00:01.4615473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4615539Z outputs = self.model( 2025-09-07T08:00:01.4615770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4615849Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4616075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4616150Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4616352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4616437Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4616666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4616759Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4616991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4617085Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4617361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:00:01.4617489Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:01.4617493Z 2025-09-07T08:00:01.4617589Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:00:01.4617784Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:01.4617845Z return mod(**inputs) 2025-09-07T08:00:01.4618083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward 2025-09-07T08:00:01.4618147Z outputs = self.model( 2025-09-07T08:00:01.4618372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward 2025-09-07T08:00:01.4618487Z decoder_outputs = self.decoder( 2025-09-07T08:00:01.4618740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward 2025-09-07T08:00:01.4618815Z layer_outputs = decoder_layer( 2025-09-07T08:00:01.4619014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:01.4619099Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:01.4619325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 413, in forward 2025-09-07T08:00:01.4619417Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:00:01.4619653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward 2025-09-07T08:00:01.4619741Z attn_output, attn_weights = attention_interface( 2025-09-07T08:00:01.4620016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:00:01.4620121Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:00:01.4620124Z 2025-09-07T08:00:01.4620198Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620276Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620345Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620422Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620490Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620558Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620638Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620706Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620782Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620851Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620919Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4620996Z cudagraph partition due to non gpu ops 2025-09-07T08:00:01.4621095Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
    attn_output = torch.nn.functional.scaled_dot_product_attention(

cudagraph partition due to non gpu ops.
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
    outputs = self.model(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
    decoder_outputs = self.decoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward
    hidden_states, cross_attn_weights = self.encoder_attn(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
    attn_output, attn_weights = attention_interface(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
    attn_output = attn_output.transpose(1, 2).contiguous()

cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.
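The partition messages themselves come from Inductor's cudagraph partitioning logic, which splits the compiled graph around ops that cannot run inside a CUDA graph; on this CPU-only run every op falls in that category, so the kernels simply run outside any graph capture. As a hedged reference point, the documented "reduce-overhead" mode of torch.compile is the usual way cudagraphs get requested; whether and how this particular benchmark run enabled them is not visible in this excerpt.

import torch

def f(x):
    return torch.relu(x) + 1

# "reduce-overhead" is the documented torch.compile mode that enables CUDA graphs
# when a CUDA device is used; on CPU it compiles normally without graph capture.
compiled = torch.compile(f, mode="reduce-overhead")
x = torch.randn(8, device="cuda" if torch.cuda.is_available() else "cpu")
print(compiled(x).sum())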
Found from :
2025-09-07T08:00:01.4699431Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:01.4699498Z return mod(**inputs)
2025-09-07T08:00:01.4699723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1471, in forward
2025-09-07T08:00:01.4699792Z outputs = self.model(
2025-09-07T08:00:01.4700019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1288, in forward
2025-09-07T08:00:01.4700084Z decoder_outputs = self.decoder(
2025-09-07T08:00:01.4700377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1115, in forward
2025-09-07T08:00:01.4700443Z layer_outputs = decoder_layer(
2025-09-07T08:00:01.4700650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:00:01.4700724Z return super().__call__(*args, **kwargs)
2025-09-07T08:00:01.4700944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 430, in forward
2025-09-07T08:00:01.4701046Z hidden_states, cross_attn_weights = self.encoder_attn(
2025-09-07T08:00:01.4701268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 254, in forward
2025-09-07T08:00:01.4701366Z attn_output, attn_weights = attention_interface(
2025-09-07T08:00:01.4701630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-09-07T08:00:01.4701728Z attn_output = attn_output.transpose(1, 2).contiguous()
2025-09-07T08:00:01.4701737Z
2025-09-07T08:00:01.4701809Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4701875Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4701950Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702017Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702082Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702154Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702223Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702294Z cudagraph partition due to non gpu ops
2025-09-07T08:00:01.4702388Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:00:01.4702568Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:01.4702632Z return mod(**inputs)
2025-09-07T08:00:01.4702857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bart/modeling_bart.py", line 1497, in forward
2025-09-07T08:00:01.4703019Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-09-07T08:00:01.4703022Z
2025-09-07T08:00:13.8777042Z pass
2025-09-07T08:00:13.8777455Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:16.4849774Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T08:00:16.4850548Z import pynvml # type: ignore[import]
2025-09-07T08:00:18.7128137Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T08:00:18.7128989Z from pkg_resources import resource_filename
2025-09-07T08:00:19.3258726Z
2025-09-07T08:00:20.4222376Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:00:20.4222827Z loading model: 0it [00:01, ?it/s]
2025-09-07T08:00:20.4223184Z cpu eval BertForMaskedLM
2025-09-07T08:00:20.6410599Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:20.7171413Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:20.7923899Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:29.5497181Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5497517Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5497753Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5498441Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5498674Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5498907Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5499119Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5499339Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5499563Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5499802Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5500027Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5500260Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5500475Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5500669Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5500860Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5501097Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5501286Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5501482Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5501683Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5501904Z cudagraph partition due to non gpu ops.
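(Editor's sketch.) The librosa UserWarning above is the standard setuptools notice about pkg_resources going away; the warning itself suggests either pinning Setuptools<81 or moving off pkg_resources. A hedged sketch of the usual replacement with the stdlib importlib.resources API (the anchor package and relative path below are placeholders chosen only so the snippet runs, not librosa's actual layout):

    from importlib.resources import files

    # Rough stdlib equivalent of pkg_resources.resource_filename(package, relpath).
    # "email" and "mime/__init__.py" are stand-ins for illustration only.
    resource_path = files("email").joinpath("mime/__init__.py")
    print(resource_path)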
Found from :
2025-09-07T08:00:29.5502275Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:29.5502600Z return mod(**inputs)
2025-09-07T08:00:29.5502976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1308, in forward
2025-09-07T08:00:29.5503334Z outputs = self.bert(
2025-09-07T08:00:29.5503689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward
2025-09-07T08:00:29.5504057Z encoder_outputs = self.encoder(
2025-09-07T08:00:29.5504429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward
2025-09-07T08:00:29.5504788Z layer_outputs = layer_module(
2025-09-07T08:00:29.5505111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:00:29.5505455Z return super().__call__(*args, **kwargs)
2025-09-07T08:00:29.5505818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward
2025-09-07T08:00:29.5506185Z self_attention_outputs = self.attention(
2025-09-07T08:00:29.5506547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-09-07T08:00:29.5506898Z return func(*args, **kwargs)
2025-09-07T08:00:29.5507248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward
2025-09-07T08:00:29.5507603Z self_outputs = self.self(
2025-09-07T08:00:29.5507944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-09-07T08:00:29.5508289Z return func(*args, **kwargs)
2025-09-07T08:00:29.5508637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward
2025-09-07T08:00:29.5509059Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T08:00:29.5509235Z
2025-09-07T08:00:29.5509319Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5509517Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5509705Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5509896Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5510086Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5510274Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5510457Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5510651Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5510842Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5511038Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5511283Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5511522Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5511718Z cudagraph partition due to non gpu ops
2025-09-07T08:00:29.5511945Z cudagraph partition due to non gpu ops.
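(Editor's sketch.) The repeated "cudagraph partition due to non gpu ops" lines are Inductor reporting that it split a compiled graph around operations that do not run on a GPU; on a CPU-only shard like this one that applies to essentially every op, so CUDA graphs never get captured and only the partition messages remain. A minimal, hedged illustration of the torch.compile mode that asks for CUDA graphs in the first place (the tiny model and input are stand-ins, not the benchmark's code, and it is an assumption that a comparable setting is active here):

    import torch

    # Stand-in model and input; the benchmark compiles full HuggingFace models.
    model = torch.nn.Sequential(
        torch.nn.Linear(128, 256),
        torch.nn.ReLU(),
        torch.nn.Linear(256, 128),
    )
    x = torch.randn(32, 128)

    # "reduce-overhead" is the torch.compile mode that enables CUDA graphs where
    # possible; graphs containing non-GPU ops are partitioned around them, which
    # is what the log lines above report. On CPU this compiles but skips cudagraphs.
    compiled = torch.compile(model, mode="reduce-overhead")
    out = compiled(x)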
Found from :
2025-09-07T08:00:29.5620278Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:29.5620594Z return mod(**inputs)
2025-09-07T08:00:29.5620939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1328, in forward
2025-09-07T08:00:29.5621410Z masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
2025-09-07T08:00:29.5621634Z
2025-09-07T08:00:38.5381667Z pass
2025-09-07T08:00:38.5382151Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:40.8946847Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T08:00:40.8947635Z import pynvml # type: ignore[import]
2025-09-07T08:00:43.1162196Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T08:00:43.1163042Z from pkg_resources import resource_filename
2025-09-07T08:00:43.6672933Z
2025-09-07T08:00:44.5957458Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:00:44.5957783Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:00:44.5958027Z cpu eval BertForQuestionAnswering
2025-09-07T08:00:44.7754235Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:44.8441792Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:44.9121691Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:00:53.5895608Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5896238Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5896574Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5896770Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897018Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897213Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897406Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897594Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897787Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5897978Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5898167Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5898347Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5898577Z cudagraph partition due to non gpu ops.
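(Editor's sketch.) The recurring "Trying to call the empty_gpu_cache for device: cpu" warnings come from the benchmark harness trying to free accelerator memory between model runs on a device that has no cache to empty. A hedged sketch of the guard that warning implies; this is an illustrative helper written for this note, not the harness's actual common.py code:

    import torch

    def empty_gpu_cache(device: str) -> None:
        # Only CUDA and XPU expose an empty_cache(); on CPU there is nothing to
        # release, which is why the harness logs a warning for "cpu" instead.
        if device == "cuda":
            torch.cuda.empty_cache()
        elif device == "xpu" and hasattr(torch, "xpu"):
            torch.xpu.empty_cache()
        else:
            print(f"nothing to empty for device: {device}")

    empty_gpu_cache("cpu")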
Found from :
2025-09-07T08:00:53.5898947Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:53.5899262Z return mod(**inputs)
2025-09-07T08:00:53.5899634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1781, in forward
2025-09-07T08:00:53.5900026Z logits = self.qa_outputs(sequence_output)
2025-09-07T08:00:53.5900173Z
2025-09-07T08:00:53.5900247Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5900449Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5900642Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5900822Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5901011Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5901200Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5901391Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5901608Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:00:53.5901948Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:53.5902258Z return mod(**inputs)
2025-09-07T08:00:53.5902603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1767, in forward
2025-09-07T08:00:53.5902956Z outputs = self.bert(
2025-09-07T08:00:53.5903299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward
2025-09-07T08:00:53.5903658Z encoder_outputs = self.encoder(
2025-09-07T08:00:53.5904016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward
2025-09-07T08:00:53.5904374Z layer_outputs = layer_module(
2025-09-07T08:00:53.5904696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:00:53.5905035Z return super().__call__(*args, **kwargs)
2025-09-07T08:00:53.5905395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward
2025-09-07T08:00:53.5905760Z self_attention_outputs = self.attention(
2025-09-07T08:00:53.5906118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-09-07T08:00:53.5906470Z return func(*args, **kwargs)
2025-09-07T08:00:53.5906817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward
2025-09-07T08:00:53.5907172Z self_outputs = self.self(
2025-09-07T08:00:53.5907514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
2025-09-07T08:00:53.5907854Z return func(*args, **kwargs)
2025-09-07T08:00:53.5908196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward
2025-09-07T08:00:53.5908601Z attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T08:00:53.5908773Z
2025-09-07T08:00:53.5908853Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5909098Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5911621Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5911816Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5912009Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5912203Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.5912388Z cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5912579Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5912768Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5912960Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5913144Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5913334Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5913522Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5913740Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:00:53.5914075Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:53.5914379Z return mod(**inputs) 2025-09-07T08:00:53.5914723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1767, in forward 2025-09-07T08:00:53.5915076Z outputs = self.bert( 2025-09-07T08:00:53.5915400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward 2025-09-07T08:00:53.5915755Z encoder_outputs = self.encoder( 2025-09-07T08:00:53.5916108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward 2025-09-07T08:00:53.5916464Z layer_outputs = layer_module( 2025-09-07T08:00:53.5916793Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:53.5917129Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:53.5917483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward 2025-09-07T08:00:53.5917852Z self_attention_outputs = self.attention( 2025-09-07T08:00:53.5918210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:00:53.5918547Z return func(*args, **kwargs) 2025-09-07T08:00:53.5918888Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward 2025-09-07T08:00:53.5919239Z self_outputs = self.self( 2025-09-07T08:00:53.5919581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:00:53.5919921Z return func(*args, **kwargs) 2025-09-07T08:00:53.5920252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward 2025-09-07T08:00:53.5920656Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:53.5920834Z 2025-09-07T08:00:53.5920911Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5921112Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5921304Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5921489Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5921682Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5921874Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5922062Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5922245Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5922438Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5922629Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5922822Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5923005Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5923193Z cudagraph partition due to non gpu ops 
2025-09-07T08:00:53.5923412Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:00:53.5923795Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:00:53.5924129Z return mod(**inputs) 2025-09-07T08:00:53.5924466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1767, in forward 2025-09-07T08:00:53.5924811Z outputs = self.bert( 2025-09-07T08:00:53.5925137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1028, in forward 2025-09-07T08:00:53.5925482Z encoder_outputs = self.encoder( 2025-09-07T08:00:53.5925828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 675, in forward 2025-09-07T08:00:53.5926177Z layer_outputs = layer_module( 2025-09-07T08:00:53.5926505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:00:53.5926844Z return super().__call__(*args, **kwargs) 2025-09-07T08:00:53.5927194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 584, in forward 2025-09-07T08:00:53.5927561Z self_attention_outputs = self.attention( 2025-09-07T08:00:53.5927913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:00:53.5928258Z return func(*args, **kwargs) 2025-09-07T08:00:53.5928599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 514, in forward 2025-09-07T08:00:53.5928940Z self_outputs = self.self( 2025-09-07T08:00:53.5929278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:00:53.5929621Z return func(*args, **kwargs) 2025-09-07T08:00:53.5929961Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 438, in forward 2025-09-07T08:00:53.5930361Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:00:53.5930545Z 2025-09-07T08:00:53.5930619Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5930818Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931011Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931205Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931389Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931584Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931774Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5931966Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5932147Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5932338Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5932527Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5932719Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5932899Z cudagraph partition due to non gpu ops 2025-09-07T08:00:53.5933120Z cudagraph partition due to non gpu ops. 
2025-09-07T08:00:53.6018274Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6018476Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6018665Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6018858Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6019054Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6019243Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6019427Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6019619Z cudagraph partition due to non gpu ops
2025-09-07T08:00:53.6019844Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:00:53.6020182Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:53.6020480Z     return mod(**inputs)
2025-09-07T08:00:53.6020825Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1799, in forward
2025-09-07T08:00:53.6021212Z     start_loss = loss_fct(start_logits, start_positions)
2025-09-07T08:00:53.6021355Z
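The trace just above and the one that follows point at the start/end span losses of a BERT question-answering head (start_loss at modeling_bert.py:1799, end_loss at line 1800). A minimal sketch of that loss computation with made-up shapes, assuming the usual CrossEntropyLoss and averaging convention (neither is shown in the log itself):

import torch
import torch.nn as nn

batch, seq_len = 8, 384  # hypothetical sizes
start_logits = torch.randn(batch, seq_len)
end_logits = torch.randn(batch, seq_len)
start_positions = torch.randint(0, seq_len, (batch,))
end_positions = torch.randint(0, seq_len, (batch,))

loss_fct = nn.CrossEntropyLoss()
start_loss = loss_fct(start_logits, start_positions)  # cf. modeling_bert.py:1799
end_loss = loss_fct(end_logits, end_positions)        # cf. modeling_bert.py:1800
total_loss = (start_loss + end_loss) / 2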
2025-09-07T08:00:53.6021463Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:00:53.6021793Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:00:53.6022092Z     return mod(**inputs)
2025-09-07T08:00:53.6022429Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 1800, in forward
2025-09-07T08:00:53.6022803Z     end_loss = loss_fct(end_logits, end_positions)
2025-09-07T08:00:53.6022944Z
2025-09-07T08:01:02.4690949Z pass
2025-09-07T08:01:02.4691370Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:04.7167255Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T08:01:04.7168140Z   import pynvml  # type: ignore[import]
2025-09-07T08:01:06.9432316Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T08:01:06.9433322Z   from pkg_resources import resource_filename
2025-09-07T08:01:07.5295190Z
2025-09-07T08:01:25.4483450Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:01:25.4483808Z loading model: 0it [00:17, ?it/s]
2025-09-07T08:01:25.4484101Z cpu eval BlenderbotForCausalLM
2025-09-07T08:01:25.8728106Z pass_due_to_skip
2025-09-07T08:01:25.8728489Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:27.7859803Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T08:01:27.7860602Z   import pynvml  # type: ignore[import]
2025-09-07T08:01:30.0135977Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81.
2025-09-07T08:01:30.0136830Z   from pkg_resources import resource_filename
2025-09-07T08:01:30.5616417Z
2025-09-07T08:01:31.2398125Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:01:31.2398790Z loading model: 0it [00:00, ?it/s]
2025-09-07T08:01:31.2405012Z cpu eval BlenderbotSmallForCausalLM
2025-09-07T08:01:31.3325318Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:31.3744273Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:31.4136370Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:38.5939789Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5940076Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5940281Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5940471Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5940698Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5940972Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5941257Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5941497Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5941700Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5941912Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5942128Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5942310Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5942529Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5942743Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5942980Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5943179Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5943402Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5943618Z cudagraph partition due to non gpu ops
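The WARNING:common lines above come from the benchmark harness asking to empty an accelerator memory cache while the active device is cpu, where there is no such cache, so the call is skipped. A minimal sketch of device-guarded cache clearing using only public torch calls; the harness's own empty_gpu_cache helper is not reproduced here, and maybe_empty_cache is a hypothetical name for illustration:

import torch

def maybe_empty_cache(device: str) -> None:
    # Only CUDA and XPU expose a cache to clear; on cpu there is nothing to do,
    # which is what the warning above is pointing out.
    if device == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()

maybe_empty_cache("cpu")  # no-op, mirrors the situation in this job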
2025-09-07T08:01:38.5943864Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:01:38.5944221Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:01:38.5944613Z     return mod(**inputs)
2025-09-07T08:01:38.5945085Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward
2025-09-07T08:01:38.5945539Z     outputs = self.model.decoder(
2025-09-07T08:01:38.5946012Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
2025-09-07T08:01:38.5946467Z     layer_outputs = decoder_layer(
2025-09-07T08:01:38.5947139Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:01:38.5947612Z     return super().__call__(*args, **kwargs)
2025-09-07T08:01:38.5948059Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
2025-09-07T08:01:38.5948537Z     hidden_states, self_attn_weights = self.self_attn(
2025-09-07T08:01:38.5949003Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-09-07T08:01:38.5949471Z     attn_output, attn_weights = attention_interface(
2025-09-07T08:01:38.5949916Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward
2025-09-07T08:01:38.5950414Z     attn_output = torch.nn.functional.scaled_dot_product_attention(
2025-09-07T08:01:38.5950597Z
2025-09-07T08:01:38.5950708Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:01:38.5951046Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:01:38.5951353Z     return mod(**inputs)
2025-09-07T08:01:38.5951770Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1512, in forward
2025-09-07T08:01:38.5952297Z     outputs = self.model.decoder(
2025-09-07T08:01:38.5952739Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward
2025-09-07T08:01:38.5953150Z     layer_outputs = decoder_layer(
2025-09-07T08:01:38.5953497Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:01:38.5953869Z     return super().__call__(*args, **kwargs)
2025-09-07T08:01:38.5954313Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward
2025-09-07T08:01:38.5954794Z     hidden_states, self_attn_weights = self.self_attn(
2025-09-07T08:01:38.5955241Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward
2025-09-07T08:01:38.5955720Z     attn_output, attn_weights = attention_interface(
2025-09-07T08:01:38.5956154Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward
2025-09-07T08:01:38.5956600Z     attn_output = attn_output.transpose(1, 2).contiguous()
2025-09-07T08:01:38.5956763Z
2025-09-07T08:01:38.5956853Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.5957045Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6058987Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6059173Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6059367Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6059561Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6059752Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6059930Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6060120Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6060310Z cudagraph partition due to non gpu ops
2025-09-07T08:01:38.6060525Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:01:38.6060855Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:01:38.6061157Z     return mod(**inputs)
2025-09-07T08:01:38.6061545Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1534, in forward
2025-09-07T08:01:38.6062017Z     loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1))
2025-09-07T08:01:38.6062197Z
2025-09-07T08:01:47.1115324Z pass
2025-09-07T08:01:47.1116237Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu]
2025-09-07T08:01:49.3932774Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you.
2025-09-07T08:01:49.3934120Z   import pynvml  # type: ignore[import]
2025-09-07T08:01:51.6162069Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html.
The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:01:51.6162915Z from pkg_resources import resource_filename 2025-09-07T08:01:52.2059618Z 2025-09-07T08:01:53.1099091Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:01:53.1100299Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:01:53.1101353Z cpu eval BlenderbotSmallForConditionalGeneration 2025-09-07T08:01:53.2713273Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:01:53.3539447Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:01:53.4354144Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:05.5470395Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5470742Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5470991Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5471238Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5471469Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5471698Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5471932Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5472166Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5472406Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5472628Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5472894Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5473138Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5473362Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5473596Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5473855Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5474101Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5474328Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5474559Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5474798Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5475242Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5475606Z return mod(**inputs) 2025-09-07T08:02:05.5476079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5476569Z outputs = self.model( 2025-09-07T08:02:05.5477006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5477550Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5478029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5478501Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5478912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5479349Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5479864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5480406Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5480902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5481582Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5482244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5482804Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5482991Z 2025-09-07T08:02:05.5483114Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5483519Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5483941Z return mod(**inputs) 2025-09-07T08:02:05.5484360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5484850Z outputs = self.model( 2025-09-07T08:02:05.5485363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5485895Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5486395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5486949Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5487345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5487727Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5488202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5488723Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5489271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5489784Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5490279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5490833Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5491052Z 2025-09-07T08:02:05.5491156Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5491412Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5491636Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5491887Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5492104Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5492336Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5492560Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5492822Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5493050Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5493266Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5493512Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5493742Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5493964Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5494229Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5494489Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5494721Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5494972Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5495427Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5495804Z return mod(**inputs) 2025-09-07T08:02:05.5496259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5496791Z outputs = self.model( 2025-09-07T08:02:05.5497296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5497752Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5498325Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5498861Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5499281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5499641Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5500168Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5500724Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5501206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5501735Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5502215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5502800Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5503000Z 2025-09-07T08:02:05.5503136Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5503535Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5503857Z return mod(**inputs) 2025-09-07T08:02:05.5504388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5504840Z outputs = self.model( 2025-09-07T08:02:05.5505292Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5505739Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5506204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5506683Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5507027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5507447Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5507968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5508529Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5509064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5509608Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5510107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5510630Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5510832Z 2025-09-07T08:02:05.5510916Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5511171Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5511405Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5511631Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5511851Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5512085Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5512320Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5512553Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5512792Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5513040Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5513252Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5513489Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5513725Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5514047Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5514282Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5514537Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5514951Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5515398Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5515753Z return mod(**inputs) 2025-09-07T08:02:05.5516260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5516737Z outputs = self.model( 2025-09-07T08:02:05.5517202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5517663Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5518182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5518725Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5519185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5519664Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5520191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5520712Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5521232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5521807Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5522356Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5522858Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5523053Z 2025-09-07T08:02:05.5523165Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5523539Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5523874Z return mod(**inputs) 2025-09-07T08:02:05.5524296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5524774Z outputs = self.model( 2025-09-07T08:02:05.5525199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5525645Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5526138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5526587Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5526955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5527296Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5527724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5528166Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5528604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5529039Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5529471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5530001Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5530155Z 2025-09-07T08:02:05.5530240Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5530449Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5530638Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5530834Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531029Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531221Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531409Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531610Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531807Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5531999Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5532188Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5532384Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5532573Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5532772Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5532960Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5533157Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5533379Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5533725Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5534030Z return mod(**inputs) 2025-09-07T08:02:05.5534435Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5534853Z outputs = self.model( 2025-09-07T08:02:05.5535250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5535666Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5536078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5536495Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5536831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5537172Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5537594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5538018Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5538452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5538890Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5539310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5539770Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5539943Z 2025-09-07T08:02:05.5540044Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5540401Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5540710Z return mod(**inputs) 2025-09-07T08:02:05.5541107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5541517Z outputs = self.model( 2025-09-07T08:02:05.5541915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5542328Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5542780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5543227Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5543549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5543893Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5544312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5544744Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5545173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5545600Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5546018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5546457Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5546610Z 2025-09-07T08:02:05.5546694Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5546894Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5547083Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5547278Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5547472Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5547662Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5547849Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548041Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548236Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548425Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548611Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548804Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5548996Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5549188Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5549381Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5549576Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5549797Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5550141Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5550448Z return mod(**inputs) 2025-09-07T08:02:05.5550846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5551257Z outputs = self.model( 2025-09-07T08:02:05.5551649Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5552070Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5552479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5552899Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5553233Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5553577Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5553986Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5554420Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5554847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5555282Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5555698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5556171Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5556389Z 2025-09-07T08:02:05.5556491Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5556834Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5557145Z return mod(**inputs) 2025-09-07T08:02:05.5557540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5557946Z outputs = self.model( 2025-09-07T08:02:05.5558341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5558757Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5559170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5559585Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5559911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5560253Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5560670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5561096Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5561519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5561945Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5562360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5562794Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5562947Z 2025-09-07T08:02:05.5563032Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5563227Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5563424Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5563621Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5563816Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564008Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564193Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564385Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564578Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564770Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5564953Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5565148Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5565340Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5565534Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5565724Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5565922Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5566142Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5566486Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5566797Z return mod(**inputs) 2025-09-07T08:02:05.5567196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5567611Z outputs = self.model( 2025-09-07T08:02:05.5568009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5568431Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5568872Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5569320Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5569656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5570000Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5570410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5570844Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5571277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5571708Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5572130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5572576Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5572758Z 2025-09-07T08:02:05.5572857Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5573201Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5573509Z return mod(**inputs) 2025-09-07T08:02:05.5573904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5574306Z outputs = self.model( 2025-09-07T08:02:05.5574697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5575107Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5575515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5575928Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5576253Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5576592Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5577012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5577438Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5577857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5578293Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5578709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5579141Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5579298Z 2025-09-07T08:02:05.5579380Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5579571Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5579770Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5579962Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5580154Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5580339Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5580533Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5580730Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5580984Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5581196Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5581391Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5581591Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5581792Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5581981Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5582269Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5583250Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5583479Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5583829Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5584137Z return mod(**inputs) 2025-09-07T08:02:05.5584544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5584966Z outputs = self.model( 2025-09-07T08:02:05.5585364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5585773Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5586183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5586599Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5586982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5587313Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5587710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5588128Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5588547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5588970Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5589373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5589807Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5589981Z 2025-09-07T08:02:05.5590077Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5590413Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5590715Z return mod(**inputs) 2025-09-07T08:02:05.5591100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5591496Z outputs = self.model( 2025-09-07T08:02:05.5591882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5592285Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5592684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5593087Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5593405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5593735Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5594145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5594566Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5594972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5595399Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5595805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5596221Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5596368Z 2025-09-07T08:02:05.5596514Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5596702Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5596897Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5597090Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5597280Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5597464Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5597655Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5597845Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598039Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598225Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598419Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598610Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598799Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5598981Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5599176Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5599374Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5599594Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5599925Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5600215Z return mod(**inputs) 2025-09-07T08:02:05.5600600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5601001Z outputs = self.model( 2025-09-07T08:02:05.5601385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5601780Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5602177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5602581Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5602912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5603246Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5603648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5604070Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5604484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5604905Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5605309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5605738Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5605919Z 2025-09-07T08:02:05.5606015Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5606347Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5606647Z return mod(**inputs) 2025-09-07T08:02:05.5607032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5607426Z outputs = self.model( 2025-09-07T08:02:05.5607809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1195, in forward 2025-09-07T08:02:05.5608211Z encoder_outputs = self.encoder( 2025-09-07T08:02:05.5608610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 812, in forward 2025-09-07T08:02:05.5609002Z layer_outputs = encoder_layer( 2025-09-07T08:02:05.5609396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5609738Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5610151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 296, in forward 2025-09-07T08:02:05.5610576Z hidden_states, attn_weights = self.self_attn( 2025-09-07T08:02:05.5610987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5611420Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5611830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5612258Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5612409Z 2025-09-07T08:02:05.5612496Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5612693Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5612897Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5613100Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5613295Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5613485Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5613680Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5613878Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5614076Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5614269Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5614466Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5614661Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5614853Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5615039Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5615236Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5615433Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5615657Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5615989Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5616293Z return mod(**inputs) 2025-09-07T08:02:05.5616692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5617102Z outputs = self.model( 2025-09-07T08:02:05.5617494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5617900Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5618303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5618713Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5619047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5619388Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5619792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5620226Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5620658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5621088Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5621500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5621935Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5622143Z 2025-09-07T08:02:05.5622270Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5622598Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5622895Z return mod(**inputs) 2025-09-07T08:02:05.5623271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5623673Z outputs = self.model( 2025-09-07T08:02:05.5624058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5624459Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5624856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5625249Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5625576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5625915Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5626319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5626743Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5627162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5627585Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5627994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5628414Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5628561Z 2025-09-07T08:02:05.5628642Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5628836Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629029Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629219Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629412Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629595Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629786Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5629976Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5630165Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5630346Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5630537Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5630724Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5630941Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5631266Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5631569Z return mod(**inputs) 2025-09-07T08:02:05.5631968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5632384Z outputs = self.model( 2025-09-07T08:02:05.5632783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5633190Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5633602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5634021Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5634359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5634702Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5635166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5635651Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5636088Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5636511Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5636911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5637355Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5637535Z 2025-09-07T08:02:05.5637631Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5637963Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5638267Z return mod(**inputs) 2025-09-07T08:02:05.5638654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5639051Z outputs = self.model( 2025-09-07T08:02:05.5639434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5639834Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5640232Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5640631Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5640960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5641296Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5641710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5642151Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5642578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5643008Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5643418Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5643836Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5643984Z 2025-09-07T08:02:05.5644066Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5644257Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5644453Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5644644Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5644840Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645029Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645223Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645414Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645611Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645794Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5645983Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5646174Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5646364Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5646548Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5646739Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5646934Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5647154Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5647482Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5647822Z return mod(**inputs) 2025-09-07T08:02:05.5648241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5648639Z outputs = self.model( 2025-09-07T08:02:05.5649024Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5649419Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5649825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5650227Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5650553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5650880Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5651290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5651717Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5652142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5652567Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5652967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5653401Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5653574Z 2025-09-07T08:02:05.5653667Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5654000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5654300Z return mod(**inputs) 2025-09-07T08:02:05.5654679Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5655081Z outputs = self.model( 2025-09-07T08:02:05.5655469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5655874Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5656273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5656671Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5657001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5657334Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5657750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5658179Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5658595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5659022Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5659432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5659852Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5659998Z 2025-09-07T08:02:05.5660079Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5660267Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5660463Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5660660Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5660886Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5661107Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5661300Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5661492Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5661681Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5661864Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5662055Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5662247Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5662461Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5662788Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5663089Z return mod(**inputs) 2025-09-07T08:02:05.5663477Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5663877Z outputs = self.model( 2025-09-07T08:02:05.5664258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5664660Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5665061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5665459Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5665785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5666114Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5666522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5666958Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5667394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5667823Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5668218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5668653Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5668826Z 2025-09-07T08:02:05.5668924Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5669258Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5669562Z return mod(**inputs) 2025-09-07T08:02:05.5669940Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5670339Z outputs = self.model( 2025-09-07T08:02:05.5670722Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5671130Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5671528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5671927Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5672251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5672586Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5672995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5673421Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5673892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5674367Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5674776Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5675192Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5675336Z 2025-09-07T08:02:05.5675409Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5696663Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5697036Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5697245Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5697440Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5697628Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5697823Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698018Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698219Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698412Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698606Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698797Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5698988Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5699174Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5699370Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5699567Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5699794Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5700143Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5700461Z return mod(**inputs) 2025-09-07T08:02:05.5700891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5701312Z outputs = self.model( 2025-09-07T08:02:05.5701710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5702133Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5702539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5702947Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5703281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5703625Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5704037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5704461Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5704897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5705333Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5705744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5706196Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5706369Z 2025-09-07T08:02:05.5706471Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5706810Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5707116Z return mod(**inputs) 2025-09-07T08:02:05.5707507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5707910Z outputs = self.model( 2025-09-07T08:02:05.5708570Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5709057Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5709469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5709877Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5710200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5710541Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5710956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5711389Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5711827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5712252Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5712662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5713088Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5713238Z 2025-09-07T08:02:05.5713321Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5713522Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5713711Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5713903Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5714094Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5714283Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5714463Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5714652Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5714841Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5715033Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5715215Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5715407Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5715630Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5715970Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5716267Z return mod(**inputs) 2025-09-07T08:02:05.5716656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5717058Z outputs = self.model( 2025-09-07T08:02:05.5717444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5717852Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5718251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5718663Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5718991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5719329Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5719740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5720170Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5720601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5721028Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5721472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5721949Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5722119Z 2025-09-07T08:02:05.5722217Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5722553Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5722855Z return mod(**inputs) 2025-09-07T08:02:05.5723235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5723631Z outputs = self.model( 2025-09-07T08:02:05.5724014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5724416Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5724816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5725223Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5725538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5725876Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5726285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5726722Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5727153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5727573Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5727982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5728408Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5728555Z 2025-09-07T08:02:05.5728638Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5728835Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729024Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729216Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729411Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729605Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729786Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5729979Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5730172Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5730362Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5730545Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5730736Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5730925Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5731118Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5731302Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5731493Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5731712Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5732048Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5732343Z return mod(**inputs) 2025-09-07T08:02:05.5732733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5733134Z outputs = self.model( 2025-09-07T08:02:05.5733517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5733926Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5734370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5734804Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5735129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5735464Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5735865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5736289Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5736717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5737140Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5737549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5737983Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5738162Z 2025-09-07T08:02:05.5738259Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5738594Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5738889Z return mod(**inputs) 2025-09-07T08:02:05.5739275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5739666Z outputs = self.model( 2025-09-07T08:02:05.5740047Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5740451Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5740853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5741262Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5741578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5741911Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5742318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5742747Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5743172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5743588Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5743998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5744415Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5744574Z 2025-09-07T08:02:05.5744647Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5744844Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745039Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745222Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745416Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745606Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745795Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5745976Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5746168Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5746359Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5746549Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5746739Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5746949Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5747344Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5747649Z return mod(**inputs) 2025-09-07T08:02:05.5748040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5748440Z outputs = self.model( 2025-09-07T08:02:05.5748828Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5749232Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5749636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5750036Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5750360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5750699Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5751107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5751542Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5751969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5752399Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5752806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5753246Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5753415Z 2025-09-07T08:02:05.5753520Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5753848Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5754151Z return mod(**inputs) 2025-09-07T08:02:05.5754534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5754931Z outputs = self.model( 2025-09-07T08:02:05.5755316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5755717Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5756120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5756525Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5756853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5757195Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5757598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5758033Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5758465Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5758888Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5759293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5759712Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5759869Z 2025-09-07T08:02:05.5759943Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5760138Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5760367Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5760581Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5760778Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5760973Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5761170Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5761355Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5761548Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5761740Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5761933Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5762113Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5762306Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5762496Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5762685Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5762866Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5763085Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5763426Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5763726Z return mod(**inputs) 2025-09-07T08:02:05.5764117Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5764513Z outputs = self.model( 2025-09-07T08:02:05.5764900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5765305Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5765705Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5766111Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5766430Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5766771Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5767181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5767606Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5768027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5768455Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5768862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5769303Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5769469Z 2025-09-07T08:02:05.5769574Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5769905Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5770211Z return mod(**inputs) 2025-09-07T08:02:05.5770596Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5770996Z outputs = self.model( 2025-09-07T08:02:05.5771379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5771776Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5772175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5772582Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5772909Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5773281Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5773728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5774162Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5774595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5775028Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5775440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5775855Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5776010Z 2025-09-07T08:02:05.5776084Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5776284Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5776477Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5776668Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5776866Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5777061Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5777252Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5777439Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5777633Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5777824Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5778016Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5778196Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5778417Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5778751Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5779052Z return mod(**inputs) 2025-09-07T08:02:05.5779444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5779849Z outputs = self.model( 2025-09-07T08:02:05.5780243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5780651Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5781153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5781563Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5781902Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5782252Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5782664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5783105Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5783535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5783965Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5784371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5784812Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5784979Z 2025-09-07T08:02:05.5785077Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5785404Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5785704Z return mod(**inputs) 2025-09-07T08:02:05.5786146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5786599Z outputs = self.model( 2025-09-07T08:02:05.5786990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5787386Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5787796Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5788207Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5788536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5788866Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5789277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5789715Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5790153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5790580Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5790978Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5791396Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5791551Z 2025-09-07T08:02:05.5791625Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5791823Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792017Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792199Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792392Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792584Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792776Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5792960Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5793149Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5793337Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5793525Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5793705Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5793893Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5794085Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5794275Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5794455Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5794673Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5795010Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5795315Z return mod(**inputs) 2025-09-07T08:02:05.5795704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5796102Z outputs = self.model( 2025-09-07T08:02:05.5796482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5796885Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5797276Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5797677Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5798003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5798335Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5798734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5799194Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5799655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5800080Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5800488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5800920Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5801095Z 2025-09-07T08:02:05.5801190Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5801519Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5801820Z return mod(**inputs) 2025-09-07T08:02:05.5802208Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5802602Z outputs = self.model( 2025-09-07T08:02:05.5802987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5803390Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5803790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5804192Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5804509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5804840Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5805246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5805670Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5806087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5806514Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5806919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5807341Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5807487Z 2025-09-07T08:02:05.5807568Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5807756Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5807951Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5808138Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5808328Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5808507Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5808701Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5808891Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5809081Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5809264Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5809456Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5809647Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5809864Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5810200Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5810492Z return mod(**inputs) 2025-09-07T08:02:05.5810877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5811270Z outputs = self.model( 2025-09-07T08:02:05.5811655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5812130Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5812532Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5812931Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5813255Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5813584Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5813984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5814420Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5814856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5815286Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5815697Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5816127Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5816300Z 2025-09-07T08:02:05.5816396Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5816728Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5817028Z return mod(**inputs) 2025-09-07T08:02:05.5817407Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5817796Z outputs = self.model( 2025-09-07T08:02:05.5818176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5818576Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5818976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5819368Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5819683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5820011Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5820417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5820852Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5821274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5821695Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5822101Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5822520Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5822665Z 2025-09-07T08:02:05.5822743Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5822931Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5823118Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5823347Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5823537Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5823725Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5823914Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5824096Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5824283Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5824471Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5824695Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5824917Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5825109Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5825298Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5825487Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5825677Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5825889Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5826225Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5826532Z return mod(**inputs) 2025-09-07T08:02:05.5826920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5827311Z outputs = self.model( 2025-09-07T08:02:05.5827700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5828107Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5828513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5828911Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5829224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5829560Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5829967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5830393Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5830814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5831236Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5831644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5832078Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5832246Z 2025-09-07T08:02:05.5832349Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5832668Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5832965Z return mod(**inputs) 2025-09-07T08:02:05.5833346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5833738Z outputs = self.model( 2025-09-07T08:02:05.5834117Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5834514Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5834915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5835319Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5835644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5835976Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5836376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5836803Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5837228Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5837318Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5837660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5837761Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5837765Z 2025-09-07T08:02:05.5837845Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5837916Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5837988Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838067Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838136Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838216Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838286Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838356Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838431Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838501Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838576Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838652Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5838748Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5838941Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5839002Z return mod(**inputs) 2025-09-07T08:02:05.5839291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5839356Z outputs = self.model( 2025-09-07T08:02:05.5839634Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5839709Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5839988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5840063Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5840268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5840341Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5840630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5840730Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5841014Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5841101Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5841370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5841488Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5841497Z 2025-09-07T08:02:05.5841591Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5841781Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5841841Z return mod(**inputs) 2025-09-07T08:02:05.5842125Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5842188Z outputs = self.model( 2025-09-07T08:02:05.5842472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5842539Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5842817Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5842893Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5843155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5843238Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5843521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5843619Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5843911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5844000Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5844275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5844373Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5844376Z 2025-09-07T08:02:05.5844460Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844531Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844599Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844677Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844745Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844822Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844889Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5844957Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845032Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845100Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845167Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845241Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845308Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845383Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845449Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845523Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5845624Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5845807Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5845877Z return mod(**inputs) 2025-09-07T08:02:05.5846156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5846219Z outputs = self.model( 2025-09-07T08:02:05.5846505Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5846571Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5846860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5846929Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5847140Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5847212Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5847489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5847587Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5847862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5847957Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5848219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5848345Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5848412Z 2025-09-07T08:02:05.5848507Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5848691Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5848760Z return mod(**inputs) 2025-09-07T08:02:05.5849040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5849111Z outputs = self.model( 2025-09-07T08:02:05.5849389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5849456Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5849744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5849810Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5850027Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5850101Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5850385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 398, in forward 2025-09-07T08:02:05.5850476Z hidden_states, self_attn_weights = self.self_attn( 2025-09-07T08:02:05.5850750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5850846Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5851107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5851213Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5851216Z 2025-09-07T08:02:05.5851294Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851365Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851443Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851512Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851588Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851655Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851722Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851797Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851866Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5851944Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5852011Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5852078Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5852179Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5852361Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5852435Z return mod(**inputs) 2025-09-07T08:02:05.5852714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5852779Z outputs = self.model( 2025-09-07T08:02:05.5853063Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5853129Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5853411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5853478Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5853680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5853759Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5854077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5854214Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5854489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5854585Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5854846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 81, in sdpa_attention_forward 2025-09-07T08:02:05.5854964Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:05.5854968Z 2025-09-07T08:02:05.5855065Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:05.5855247Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5855316Z return mod(**inputs) 2025-09-07T08:02:05.5855597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1375, in forward 2025-09-07T08:02:05.5855664Z outputs = self.model( 2025-09-07T08:02:05.5855943Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1213, in forward 2025-09-07T08:02:05.5856008Z decoder_outputs = self.decoder( 2025-09-07T08:02:05.5856294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1057, in forward 2025-09-07T08:02:05.5856360Z layer_outputs = decoder_layer( 2025-09-07T08:02:05.5856569Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:05.5856642Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:05.5856922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 415, in forward 2025-09-07T08:02:05.5857028Z hidden_states, cross_attn_weights = self.encoder_attn( 2025-09-07T08:02:05.5857304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 237, in forward 2025-09-07T08:02:05.5857394Z attn_output, attn_weights = attention_interface( 2025-09-07T08:02:05.5857655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/integrations/sdpa_attention.py", line 91, in sdpa_attention_forward 2025-09-07T08:02:05.5857759Z attn_output = attn_output.transpose(1, 2).contiguous() 2025-09-07T08:02:05.5857762Z 2025-09-07T08:02:05.5857832Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5857902Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5857978Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858046Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858122Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858196Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858264Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858338Z cudagraph partition due to non gpu ops 2025-09-07T08:02:05.5858430Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:02:05.5858617Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:05.5858675Z return mod(**inputs) 2025-09-07T08:02:05.5858954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/blenderbot_small/modeling_blenderbot_small.py", line 1398, in forward 2025-09-07T08:02:05.5859112Z masked_lm_loss = loss_fct(lm_logits.view(-1, self.config.vocab_size), labels.view(-1)) 2025-09-07T08:02:05.5859115Z 2025-09-07T08:02:15.4298070Z pass 2025-09-07T08:02:15.4298488Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:17.7750683Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 
2025-09-07T08:02:17.7751735Z import pynvml # type: ignore[import] 2025-09-07T08:02:20.0032990Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:02:20.0033833Z from pkg_resources import resource_filename 2025-09-07T08:02:20.5811609Z 2025-09-07T08:02:21.7392986Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:02:21.7393301Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:02:21.7393558Z cpu eval CamemBert 2025-09-07T08:02:21.9645855Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:22.0441386Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:22.1215558Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:30.9605258Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:02:30.9605676Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9606011Z return mod(**inputs) 2025-09-07T08:02:30.9606416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9606811Z outputs = self.roberta( 2025-09-07T08:02:30.9607220Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 886, in forward 2025-09-07T08:02:30.9607617Z embedding_output = self.embeddings( 2025-09-07T08:02:30.9608005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 90, in forward 2025-09-07T08:02:30.9608508Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-09-07T08:02:30.9609089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1590, in create_position_ids_from_input_ids 2025-09-07T08:02:30.9609540Z mask = input_ids.ne(padding_idx).int() 2025-09-07T08:02:30.9609669Z 2025-09-07T08:02:30.9609747Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610009Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610197Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610392Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610589Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610785Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9610976Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9611168Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9611359Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9611550Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9611761Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9611943Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9612163Z cudagraph partition due to non gpu ops. 
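The CamemBert traces above and below partition at create_position_ids_from_input_ids, the integer mask/cumsum arithmetic that derives position ids from input ids. A short sketch of that computation, reconstructed from the two lines quoted in the traces (modeling_camembert.py lines 1590-1591); the example input and padding index are made up for illustration.

import torch

def create_position_ids_from_input_ids(input_ids, padding_idx, past_key_values_length=0):
    # Line 1590 in the trace: integer mask of non-padding positions.
    mask = input_ids.ne(padding_idx).int()
    # Line 1591 in the trace: running count of real tokens, offset by any cached prefix length.
    incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask
    # Shift so padding keeps padding_idx and real tokens start after it (as in the transformers helper).
    return incremental_indices.long() + padding_idx

input_ids = torch.tensor([[5, 17, 23, 1, 1]])   # 1 acting as padding_idx, purely illustrative
print(create_position_ids_from_input_ids(input_ids, padding_idx=1))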
Found from : 2025-09-07T08:02:30.9612503Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9612808Z return mod(**inputs) 2025-09-07T08:02:30.9613166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9613544Z outputs = self.roberta( 2025-09-07T08:02:30.9614236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 886, in forward 2025-09-07T08:02:30.9614731Z embedding_output = self.embeddings( 2025-09-07T08:02:30.9615104Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 90, in forward 2025-09-07T08:02:30.9615604Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-09-07T08:02:30.9616169Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1591, in create_position_ids_from_input_ids 2025-09-07T08:02:30.9616720Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-09-07T08:02:30.9616948Z 2025-09-07T08:02:30.9617058Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:02:30.9617393Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9617706Z return mod(**inputs) 2025-09-07T08:02:30.9618068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9618444Z outputs = self.roberta( 2025-09-07T08:02:30.9618804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 886, in forward 2025-09-07T08:02:30.9619176Z embedding_output = self.embeddings( 2025-09-07T08:02:30.9619553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 90, in forward 2025-09-07T08:02:30.9620056Z position_ids = create_position_ids_from_input_ids(input_ids, self.padding_idx, past_key_values_length) 2025-09-07T08:02:30.9620608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1591, in create_position_ids_from_input_ids 2025-09-07T08:02:30.9621154Z incremental_indices = (torch.cumsum(mask, dim=1).type_as(mask) + past_key_values_length) * mask 2025-09-07T08:02:30.9621378Z 2025-09-07T08:02:30.9621459Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9621648Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9621845Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9622036Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9622232Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9622416Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9622610Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9622832Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9623163Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9623454Z return mod(**inputs) 2025-09-07T08:02:30.9623816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9624203Z outputs = self.roberta( 2025-09-07T08:02:30.9624564Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9624950Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9625319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9625695Z layer_outputs = layer_module( 2025-09-07T08:02:30.9626026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9626372Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9626745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9627132Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9628543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9628973Z return func(*args, **kwargs) 2025-09-07T08:02:30.9629347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9629719Z self_outputs = self.self( 2025-09-07T08:02:30.9630069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9630416Z return func(*args, **kwargs) 2025-09-07T08:02:30.9630784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9631218Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9631390Z 2025-09-07T08:02:30.9631461Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9631661Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9631859Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632054Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632238Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632429Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632620Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632812Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9632995Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9633186Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9633373Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9633561Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9633746Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9633968Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9634310Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9634613Z return mod(**inputs) 2025-09-07T08:02:30.9634974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9635352Z outputs = self.roberta( 2025-09-07T08:02:30.9635716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9636097Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9636472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9636842Z layer_outputs = layer_module( 2025-09-07T08:02:30.9637170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9637512Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9637897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9638288Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9638641Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9638987Z return func(*args, **kwargs) 2025-09-07T08:02:30.9639350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9639727Z self_outputs = self.self( 2025-09-07T08:02:30.9640058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9640432Z return func(*args, **kwargs) 2025-09-07T08:02:30.9640804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9641228Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9641407Z 2025-09-07T08:02:30.9641553Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9641751Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9641945Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9642141Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9642325Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9642518Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9642711Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9642902Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9643084Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9643279Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9643475Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9643664Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9643847Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9644064Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9644401Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9644705Z return mod(**inputs) 2025-09-07T08:02:30.9645061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9645433Z outputs = self.roberta( 2025-09-07T08:02:30.9645788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9646161Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9646533Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9646900Z layer_outputs = layer_module( 2025-09-07T08:02:30.9647223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9647557Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9647937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9648318Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9648671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9649015Z return func(*args, **kwargs) 2025-09-07T08:02:30.9649388Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9649755Z self_outputs = self.self( 2025-09-07T08:02:30.9650083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9650428Z return func(*args, **kwargs) 2025-09-07T08:02:30.9650790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9651222Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9651393Z 2025-09-07T08:02:30.9651474Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9651666Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9651858Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9652054Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9652244Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9652428Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9652617Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9652813Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9653006Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9653191Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9653385Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9653576Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9653771Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9654069Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9654407Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9654711Z return mod(**inputs) 2025-09-07T08:02:30.9655069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9655438Z outputs = self.roberta( 2025-09-07T08:02:30.9655797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9656174Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9656546Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9656919Z layer_outputs = layer_module( 2025-09-07T08:02:30.9657236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9657584Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9657968Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9658356Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9658712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9659049Z return func(*args, **kwargs) 2025-09-07T08:02:30.9659414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9659791Z self_outputs = self.self( 2025-09-07T08:02:30.9660129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9660466Z return func(*args, **kwargs) 2025-09-07T08:02:30.9660834Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9661262Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9661427Z 2025-09-07T08:02:30.9661508Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9661704Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9661887Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9662077Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9662264Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9662454Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9662635Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9662823Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663015Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663208Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663391Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663587Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663780Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9663998Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9664329Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9664632Z return mod(**inputs) 2025-09-07T08:02:30.9664995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9665374Z outputs = self.roberta( 2025-09-07T08:02:30.9665735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9666102Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9666475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9666885Z layer_outputs = layer_module( 2025-09-07T08:02:30.9667241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9667573Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9667952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9668339Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9668698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9669044Z return func(*args, **kwargs) 2025-09-07T08:02:30.9669401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9669777Z self_outputs = self.self( 2025-09-07T08:02:30.9670116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9670463Z return func(*args, **kwargs) 2025-09-07T08:02:30.9670826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9671246Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9671422Z 2025-09-07T08:02:30.9671493Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9671687Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9671882Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9672131Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9672320Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9672511Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9672702Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9672882Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9673070Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9673266Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9673458Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9673643Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9673835Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9674058Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9674394Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9674698Z return mod(**inputs) 2025-09-07T08:02:30.9675052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9675425Z outputs = self.roberta( 2025-09-07T08:02:30.9675785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9676160Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9676525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9676902Z layer_outputs = layer_module( 2025-09-07T08:02:30.9677223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9677560Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9677936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9678314Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9678671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9679015Z return func(*args, **kwargs) 2025-09-07T08:02:30.9679383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9679827Z self_outputs = self.self( 2025-09-07T08:02:30.9680159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9680503Z return func(*args, **kwargs) 2025-09-07T08:02:30.9680876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9681568Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9681738Z 2025-09-07T08:02:30.9681820Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9682012Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9682215Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9682416Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9682616Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9682808Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683006Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683211Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683414Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683601Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683793Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9683989Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9684182Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9684400Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9684746Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9685056Z return mod(**inputs) 2025-09-07T08:02:30.9685427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9685803Z outputs = self.roberta( 2025-09-07T08:02:30.9686181Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9686570Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9686955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9687336Z layer_outputs = layer_module( 2025-09-07T08:02:30.9687657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9688000Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9688384Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9688773Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9689130Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9689481Z return func(*args, **kwargs) 2025-09-07T08:02:30.9689856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9690241Z self_outputs = self.self( 2025-09-07T08:02:30.9690583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9690927Z return func(*args, **kwargs) 2025-09-07T08:02:30.9691301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9691735Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9691906Z 2025-09-07T08:02:30.9691988Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9692187Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9692377Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9692571Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9692763Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9693076Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9693266Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9693462Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9693664Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9693874Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9694074Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9694268Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9694466Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9694684Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9695011Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9695312Z return mod(**inputs) 2025-09-07T08:02:30.9695669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9696047Z outputs = self.roberta( 2025-09-07T08:02:30.9696409Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9696779Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9697150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9697521Z layer_outputs = layer_module( 2025-09-07T08:02:30.9697844Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9698167Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9698540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9698922Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9699279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9699628Z return func(*args, **kwargs) 2025-09-07T08:02:30.9699985Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9700353Z self_outputs = self.self( 2025-09-07T08:02:30.9700684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9701025Z return func(*args, **kwargs) 2025-09-07T08:02:30.9701376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9701799Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9701971Z 2025-09-07T08:02:30.9702044Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9702238Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9702432Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9702621Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9702811Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9702999Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9703191Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9703374Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9703564Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9703755Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9703947Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9704132Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9704324Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9704542Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9704875Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9705167Z return mod(**inputs) 2025-09-07T08:02:30.9705582Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9705996Z outputs = self.roberta( 2025-09-07T08:02:30.9706368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9706750Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9707119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9707504Z layer_outputs = layer_module( 2025-09-07T08:02:30.9707836Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9708180Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9708566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9708952Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9709316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9709666Z return func(*args, **kwargs) 2025-09-07T08:02:30.9710035Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9710405Z self_outputs = self.self( 2025-09-07T08:02:30.9710743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9711092Z return func(*args, **kwargs) 2025-09-07T08:02:30.9711462Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9711893Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9712063Z 2025-09-07T08:02:30.9712140Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9712344Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9712542Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9712739Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9712925Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9713118Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9713316Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9713511Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9713700Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9713894Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9714085Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9714278Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9714466Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9714692Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9715037Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9715347Z return mod(**inputs) 2025-09-07T08:02:30.9715711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9716079Z outputs = self.roberta( 2025-09-07T08:02:30.9716441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9716822Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9717199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9717573Z layer_outputs = layer_module( 2025-09-07T08:02:30.9717906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9718252Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9718666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9719092Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9719439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9719785Z return func(*args, **kwargs) 2025-09-07T08:02:30.9720153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9720531Z self_outputs = self.self( 2025-09-07T08:02:30.9720865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9721201Z return func(*args, **kwargs) 2025-09-07T08:02:30.9721566Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9721995Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9722170Z 2025-09-07T08:02:30.9722253Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9722443Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9722637Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9722829Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723023Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723207Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723400Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723592Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723780Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9723973Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9724158Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9724347Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9724536Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9724759Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9725093Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9725391Z return mod(**inputs) 2025-09-07T08:02:30.9725755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9726131Z outputs = self.roberta( 2025-09-07T08:02:30.9726490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9726868Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9727240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9727610Z layer_outputs = layer_module( 2025-09-07T08:02:30.9727936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9728267Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9728647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9729029Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9729385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9729726Z return func(*args, **kwargs) 2025-09-07T08:02:30.9730089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9730461Z self_outputs = self.self( 2025-09-07T08:02:30.9730798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9731141Z return func(*args, **kwargs) 2025-09-07T08:02:30.9731535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9731992Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9732166Z 2025-09-07T08:02:30.9732238Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9732435Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9732631Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9732815Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733005Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733198Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733388Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733572Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733768Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9733960Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9734150Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9734337Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9734533Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9734756Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9735093Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9735385Z return mod(**inputs) 2025-09-07T08:02:30.9735749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1038, in forward 2025-09-07T08:02:30.9736121Z outputs = self.roberta( 2025-09-07T08:02:30.9736480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 950, in forward 2025-09-07T08:02:30.9736850Z encoder_outputs = self.encoder( 2025-09-07T08:02:30.9737212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 632, in forward 2025-09-07T08:02:30.9737580Z layer_outputs = layer_module( 2025-09-07T08:02:30.9737911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:02:30.9738246Z return super().__call__(*args, **kwargs) 2025-09-07T08:02:30.9738619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 540, in forward 2025-09-07T08:02:30.9739005Z self_attention_outputs = self.attention( 2025-09-07T08:02:30.9739363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9739709Z return func(*args, **kwargs) 2025-09-07T08:02:30.9740072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 467, in forward 2025-09-07T08:02:30.9740435Z self_outputs = self.self( 2025-09-07T08:02:30.9740772Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func 2025-09-07T08:02:30.9741116Z return func(*args, **kwargs) 2025-09-07T08:02:30.9741483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 389, in forward 2025-09-07T08:02:30.9741901Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:02:30.9742068Z 2025-09-07T08:02:30.9742141Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9742336Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9742527Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9742716Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9742899Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9743089Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9743278Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9743469Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9743652Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9743872Z cudagraph partition due to non gpu ops 2025-09-07T08:02:30.9744123Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:02:30.9744460Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:02:30.9744756Z return mod(**inputs) 2025-09-07T08:02:30.9745119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/camembert/modeling_camembert.py", line 1059, in forward 2025-09-07T08:02:30.9745603Z masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1)) 2025-09-07T08:02:30.9745821Z 2025-09-07T08:02:39.8619615Z pass 2025-09-07T08:02:39.8620109Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:42.1840873Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:02:42.1841777Z import pynvml # type: ignore[import] 2025-09-07T08:02:44.4124444Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:02:44.4125268Z from pkg_resources import resource_filename 2025-09-07T08:02:44.9470811Z 2025-09-07T08:02:52.8638430Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:02:52.8638686Z loading model: 0it [00:07, ?it/s] 2025-09-07T08:02:52.8638898Z cpu eval DebertaV2ForMaskedLM 2025-09-07T08:02:53.0865188Z pass_due_to_skip 2025-09-07T08:02:53.0865550Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:02:54.7462697Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:02:54.7463472Z import pynvml # type: ignore[import] 2025-09-07T08:02:56.9771731Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:02:56.9772591Z from pkg_resources import resource_filename 2025-09-07T08:02:57.5593826Z 2025-09-07T08:03:04.3445686Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:03:04.3449653Z loading model: 0it [00:06, ?it/s] 2025-09-07T08:03:04.3451551Z cpu eval DebertaV2ForQuestionAnswering 2025-09-07T08:03:05.5171443Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:05.9203631Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:06.3908250Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:22.4197950Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T08:03:22.4198391Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4198703Z return mod(**inputs)
2025-09-07T08:03:22.4199110Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1244, in forward
2025-09-07T08:03:22.4199533Z logits = self.qa_outputs(sequence_output)
2025-09-07T08:03:22.4199670Z
2025-09-07T08:03:22.4199757Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4199951Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4200668Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4200864Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4201096Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:03:22.4201464Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4201765Z return mod(**inputs)
2025-09-07T08:03:22.4202138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4202524Z outputs = self.deberta(
2025-09-07T08:03:22.4202897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4203270Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4203660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4204069Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4204439Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4204785Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4205171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4205580Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4205987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4206372Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4206761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward
2025-09-07T08:03:22.4207255Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads)
2025-09-07T08:03:22.4207794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4208276Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4208453Z
2025-09-07T08:03:22.4208569Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4208916Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4209221Z return mod(**inputs)
2025-09-07T08:03:22.4209590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4209979Z outputs = self.deberta(
2025-09-07T08:03:22.4210348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4210737Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4211111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4211515Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4211867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4212214Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4212597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4213000Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4213553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4213941Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4214386Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-09-07T08:03:22.4214985Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-09-07T08:03:22.4215254Z
2025-09-07T08:03:22.4215357Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4215712Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4216029Z return mod(**inputs)
2025-09-07T08:03:22.4216396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4216777Z outputs = self.deberta(
2025-09-07T08:03:22.4217143Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4217538Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4217922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4218332Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4218678Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4219038Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4219426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4219830Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4220230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4220607Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4220994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-09-07T08:03:22.4221515Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-09-07T08:03:22.4221763Z
2025-09-07T08:03:22.4221846Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4222052Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4222272Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4222616Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4222934Z return mod(**inputs)
2025-09-07T08:03:22.4223302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4223689Z outputs = self.deberta(
2025-09-07T08:03:22.4224057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4224455Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4224839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4225238Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4225594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4225936Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4226322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4226724Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4227115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4227505Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4227928Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
2025-09-07T08:03:22.4228454Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
2025-09-07T08:03:22.4228993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4229469Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4229645Z
2025-09-07T08:03:22.4229746Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4230087Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4230397Z return mod(**inputs)
2025-09-07T08:03:22.4230767Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4231148Z outputs = self.deberta(
2025-09-07T08:03:22.4231523Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4231913Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4232293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4232695Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4233038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4233385Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4233780Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4234190Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4234595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4234975Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4235364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward
2025-09-07T08:03:22.4235748Z context_layer = torch.bmm(
2025-09-07T08:03:22.4235860Z
2025-09-07T08:03:22.4235966Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4236308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4236611Z return mod(**inputs)
2025-09-07T08:03:22.4236972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4237356Z outputs = self.deberta(
2025-09-07T08:03:22.4237730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4238113Z encoder_outputs = self.encoder(
2025-09-07T08:03:22.4238492Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4238890Z output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4239246Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4239587Z return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4239964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4240363Z attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4240765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4241182Z self_output, att_matrix = self.self(
2025-09-07T08:03:22.4241663Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward
2025-09-07T08:03:22.4242163Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1))
2025-09-07T08:03:22.4242393Z
2025-09-07T08:03:22.4242470Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4242672Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4242871Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4243063Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4243248Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4243444Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4243637Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4243838Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4244021Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4244219Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4244445Z cudagraph partition due to non gpu ops.
Found from : 2025-09-07T08:03:22.4244787Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4245097Z return mod(**inputs) 2025-09-07T08:03:22.4245460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4245841Z outputs = self.deberta( 2025-09-07T08:03:22.4246206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4246590Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4246956Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4247360Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4247707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4248050Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4248431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4248825Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4249249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4249630Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4250013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4250513Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4251033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4251508Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4251691Z 2025-09-07T08:03:22.4251791Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4252136Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4252442Z return mod(**inputs) 2025-09-07T08:03:22.4252799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4253181Z outputs = self.deberta( 2025-09-07T08:03:22.4253549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4253936Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4254352Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4254774Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4255121Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4255469Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4255855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4256255Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4256647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4257030Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4257425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4257950Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4258199Z 2025-09-07T08:03:22.4258305Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4258643Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4258951Z return mod(**inputs) 2025-09-07T08:03:22.4259317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4259699Z outputs = self.deberta( 2025-09-07T08:03:22.4260061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4260440Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4260823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4261221Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4261572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4261915Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4262304Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4262705Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4263108Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4263496Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4263877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4264390Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4264648Z 2025-09-07T08:03:22.4264723Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4264940Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4265153Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4265489Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4265786Z return mod(**inputs) 2025-09-07T08:03:22.4266145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4266516Z outputs = self.deberta( 2025-09-07T08:03:22.4266865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4267240Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4267644Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4269312Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4269656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4269984Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4270373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4270767Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4271167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4271551Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4271922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4272407Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4272923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4273382Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4273552Z 2025-09-07T08:03:22.4273660Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4273986Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4274287Z return mod(**inputs) 2025-09-07T08:03:22.4274643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4275016Z outputs = self.deberta( 2025-09-07T08:03:22.4275365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4275742Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4276109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4276494Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4276829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4277155Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4277528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4277918Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4278315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4278695Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4279061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4279433Z context_layer = torch.bmm( 2025-09-07T08:03:22.4279550Z 2025-09-07T08:03:22.4279647Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4279977Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4280266Z return mod(**inputs) 2025-09-07T08:03:22.4280618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4281045Z outputs = self.deberta( 2025-09-07T08:03:22.4281401Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4281844Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4282265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4282662Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4283012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4283356Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4283744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4284135Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4284536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4284919Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4285307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4285803Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4286031Z 2025-09-07T08:03:22.4286111Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4286315Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4286513Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4286716Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4286906Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4287103Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4287300Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4287494Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4287684Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4287884Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4288109Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4288455Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4288752Z return mod(**inputs) 2025-09-07T08:03:22.4289119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4289502Z outputs = self.deberta( 2025-09-07T08:03:22.4289869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4290251Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4290619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4291010Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4291358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4291705Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4292090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4292480Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4292879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4293263Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4293643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4294133Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4294699Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4295205Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4295389Z 2025-09-07T08:03:22.4295491Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4295834Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4296144Z return mod(**inputs) 2025-09-07T08:03:22.4296507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4296888Z outputs = self.deberta( 2025-09-07T08:03:22.4297254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4297635Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4298005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4298412Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4298765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4299113Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4299500Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4299898Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4300305Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4300694Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4301083Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4301602Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4301854Z 2025-09-07T08:03:22.4301955Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4302302Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4302631Z return mod(**inputs) 2025-09-07T08:03:22.4303000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4303377Z outputs = self.deberta( 2025-09-07T08:03:22.4303742Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4304128Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4304507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4304906Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4305249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4305595Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4305981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4306380Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4306782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4307162Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4307549Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4308058Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4308364Z 2025-09-07T08:03:22.4308447Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4308646Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4308862Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4309196Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4309497Z return mod(**inputs) 2025-09-07T08:03:22.4309853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4310222Z outputs = self.deberta( 2025-09-07T08:03:22.4310581Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4310954Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4311324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4311708Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4312043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4312380Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4312754Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4313149Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4313543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4313914Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4314291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4314775Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4315294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4315754Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4315923Z 2025-09-07T08:03:22.4316021Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4316360Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4316661Z return mod(**inputs) 2025-09-07T08:03:22.4317018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4317380Z outputs = self.deberta( 2025-09-07T08:03:22.4317739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4318114Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4318484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4318869Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4319202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4319540Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4319913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4320303Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4320694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4321061Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4321498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4321866Z context_layer = torch.bmm( 2025-09-07T08:03:22.4321976Z 2025-09-07T08:03:22.4322081Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4322416Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4322709Z return mod(**inputs) 2025-09-07T08:03:22.4323065Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4323435Z outputs = self.deberta( 2025-09-07T08:03:22.4323794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4324157Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4324525Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4324912Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4325251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4325585Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4325952Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4326343Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4326731Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4327107Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4327486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4327961Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4328194Z 2025-09-07T08:03:22.4328267Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4328471Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4328666Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4328849Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329040Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329228Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329416Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329598Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329789Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4329978Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4330191Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4330520Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4330825Z return mod(**inputs) 2025-09-07T08:03:22.4331183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4331554Z outputs = self.deberta( 2025-09-07T08:03:22.4331913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4332276Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4332645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4333033Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4333369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4333710Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4334148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4334541Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4334931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4335309Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4335676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4336154Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4336666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4337125Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4337302Z 2025-09-07T08:03:22.4337408Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4337746Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4338042Z return mod(**inputs) 2025-09-07T08:03:22.4338402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4338772Z outputs = self.deberta( 2025-09-07T08:03:22.4339129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4339493Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4339863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4340246Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4340590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4340925Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4341294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4341688Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4342076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4342450Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4342820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4343312Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4343565Z 2025-09-07T08:03:22.4343665Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4344000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4344303Z return mod(**inputs) 2025-09-07T08:03:22.4344660Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4345026Z outputs = self.deberta( 2025-09-07T08:03:22.4345383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4345752Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4346118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4346493Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4346883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4347248Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4347625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4348016Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4348397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4348773Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4349152Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4349651Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4349894Z 2025-09-07T08:03:22.4349977Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4350173Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4350395Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4350731Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4351037Z return mod(**inputs) 2025-09-07T08:03:22.4351387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4351763Z outputs = self.deberta( 2025-09-07T08:03:22.4352120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4352491Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4352862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4353241Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4353583Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4353918Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4354293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4354681Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4355059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4355436Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4355808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4356288Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4356802Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4357261Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4357438Z 2025-09-07T08:03:22.4357535Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4357870Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4358175Z return mod(**inputs) 2025-09-07T08:03:22.4358531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4358897Z outputs = self.deberta( 2025-09-07T08:03:22.4359254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4359624Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4360067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4360446Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4360788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4361122Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4361502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4361893Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4362275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4362646Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4363020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4363396Z context_layer = torch.bmm( 2025-09-07T08:03:22.4363504Z 2025-09-07T08:03:22.4363609Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4363940Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4364243Z return mod(**inputs) 2025-09-07T08:03:22.4364599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4364971Z outputs = self.deberta( 2025-09-07T08:03:22.4365319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4365691Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4366061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4366448Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4366785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4367110Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4367490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4367894Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4368283Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4368658Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4369025Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4369506Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4369736Z 2025-09-07T08:03:22.4369813Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370015Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370201Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370395Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370588Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370783Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4370977Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4371161Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4371351Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4371543Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4371760Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4372086Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4372386Z return mod(**inputs) 2025-09-07T08:03:22.4372810Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4373186Z outputs = self.deberta( 2025-09-07T08:03:22.4373533Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4373903Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4374269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4374651Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4374990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4375314Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4375698Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4376088Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4376478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4376848Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4377213Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4377696Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4378202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4378659Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4378832Z 2025-09-07T08:03:22.4378940Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4379269Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4379568Z return mod(**inputs) 2025-09-07T08:03:22.4379922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4380294Z outputs = self.deberta( 2025-09-07T08:03:22.4380643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4381071Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4381443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4381831Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4382176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4382512Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4382895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4383293Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4383694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4384078Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4384445Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4384955Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4385211Z 2025-09-07T08:03:22.4385311Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4385770Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4386073Z return mod(**inputs) 2025-09-07T08:03:22.4386424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4386803Z outputs = self.deberta( 2025-09-07T08:03:22.4387160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4387532Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4387898Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4388287Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4388626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4388967Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4389344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4389726Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4390114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4390493Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4390865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4391364Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4391606Z 2025-09-07T08:03:22.4391678Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4391878Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4392100Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T08:03:22.4392435Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4392739Z     return mod(**inputs)
2025-09-07T08:03:22.4393093Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4393464Z     outputs = self.deberta(
2025-09-07T08:03:22.4393817Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4394188Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4394548Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4394932Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4395271Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4395609Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4395985Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4396371Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4396763Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4397142Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4397518Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
2025-09-07T08:03:22.4398002Z     value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
2025-09-07T08:03:22.4398590Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4399095Z     return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4399273Z 
2025-09-07T08:03:22.4399371Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4399710Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4400014Z     return mod(**inputs)
2025-09-07T08:03:22.4400367Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4400749Z     outputs = self.deberta(
2025-09-07T08:03:22.4401110Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4401491Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4401862Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4402257Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4402603Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4402943Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4403324Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4403707Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4404103Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4404481Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4404860Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward
2025-09-07T08:03:22.4405236Z     context_layer = torch.bmm(
2025-09-07T08:03:22.4405347Z 
2025-09-07T08:03:22.4405446Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4405779Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4406080Z     return mod(**inputs)
2025-09-07T08:03:22.4406436Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4406806Z     outputs = self.deberta(
2025-09-07T08:03:22.4407165Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4407539Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4407910Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4408295Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4408631Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4408970Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4409348Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4409741Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4410138Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4410506Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4410883Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward
2025-09-07T08:03:22.4411361Z     context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1))
2025-09-07T08:03:22.4411584Z 
2025-09-07T08:03:22.4411726Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4411925Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4412116Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4412315Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4412512Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4412706Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4412892Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4413087Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4413286Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4413482Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4413693Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4414033Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4414344Z     return mod(**inputs)
2025-09-07T08:03:22.4414716Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4415088Z     outputs = self.deberta(
2025-09-07T08:03:22.4415450Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4415822Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4416199Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4416590Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4416925Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4417264Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4417638Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4418034Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4418427Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4418797Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4419178Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward
2025-09-07T08:03:22.4419658Z     query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads)
2025-09-07T08:03:22.4420176Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4420644Z     return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4420817Z 
2025-09-07T08:03:22.4420916Z cudagraph partition due to non gpu ops.
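Every trace in this block ends at one of the same few call sites in modeling_deberta_v2.py: the query/value transpose_for_scores reshapes (lines 236 and 238, via the permute/contiguous/view at line 194) and the two torch.bmm calls (lines 248 and 268). For the scaled bmm, one way to keep the computation device-only is to fold the scale into a plain Python float instead of a CPU tensor; the sketch below only illustrates that idea under the same assumptions as the sketch above, and is not a change that exists in transformers.

    import math
    import torch

    def attention_scores(q, k):
        # A plain float scale: no CPU tensor is created inside the compiled
        # region, so the division stays on the same device as q and k.
        scale = math.sqrt(q.size(-1))
        return torch.bmm(q, k.transpose(-1, -2) / scale)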
Found from : 2025-09-07T08:03:22.4421258Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4421563Z return mod(**inputs) 2025-09-07T08:03:22.4421922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4422302Z outputs = self.deberta( 2025-09-07T08:03:22.4422653Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4423025Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4423397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4423787Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4424119Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4424486Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4424899Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4425294Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4425683Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4426056Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4426434Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4426933Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4427177Z 2025-09-07T08:03:22.4427281Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4427618Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4427917Z return mod(**inputs) 2025-09-07T08:03:22.4428273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4428649Z outputs = self.deberta( 2025-09-07T08:03:22.4429007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4429385Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4429750Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4430137Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4430483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4430818Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4431193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4431592Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4431983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4432363Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4432736Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4433224Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4433474Z 2025-09-07T08:03:22.4433548Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4433750Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4433969Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4434304Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4434597Z return mod(**inputs) 2025-09-07T08:03:22.4434954Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4435328Z outputs = self.deberta( 2025-09-07T08:03:22.4435684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4436049Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4436426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4436817Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4437158Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4437520Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4437941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4438333Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4438724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4439099Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4439466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4439945Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4440458Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4440917Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4441088Z 2025-09-07T08:03:22.4441195Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4441530Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4441823Z return mod(**inputs) 2025-09-07T08:03:22.4442187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4442560Z outputs = self.deberta( 2025-09-07T08:03:22.4442919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4443288Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4443649Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4444035Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4444379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4444715Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4445085Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4445478Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4445870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4446249Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4446628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4446994Z context_layer = torch.bmm( 2025-09-07T08:03:22.4447115Z 2025-09-07T08:03:22.4447217Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4447556Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4447860Z return mod(**inputs) 2025-09-07T08:03:22.4448221Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4448587Z outputs = self.deberta( 2025-09-07T08:03:22.4448947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4449320Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4449690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4450073Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4450444Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4450807Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4451189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4451579Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4451959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4452333Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4452709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4453187Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4453409Z 2025-09-07T08:03:22.4453492Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4453688Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4453886Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4454080Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4454272Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4454453Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4454642Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4454832Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4455021Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4455201Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4455422Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4455753Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4456051Z return mod(**inputs) 2025-09-07T08:03:22.4456402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4456771Z outputs = self.deberta( 2025-09-07T08:03:22.4457128Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4457500Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4457868Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4458247Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4458587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4458917Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4459291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4459684Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4460067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4460446Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4460820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4461296Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4461807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4462261Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4462438Z 2025-09-07T08:03:22.4462536Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4462868Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4463204Z return mod(**inputs) 2025-09-07T08:03:22.4463604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4463973Z outputs = self.deberta( 2025-09-07T08:03:22.4464334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4464706Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4465073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4465452Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4465792Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4466128Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4466513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4466910Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4467295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4467674Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4468046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4468557Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4468806Z 2025-09-07T08:03:22.4468915Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4469243Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4469545Z return mod(**inputs) 2025-09-07T08:03:22.4469903Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4470281Z outputs = self.deberta( 2025-09-07T08:03:22.4470643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4471013Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4471379Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4471764Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4472106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4472430Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4472816Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4473212Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4473608Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4473982Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4474350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4474850Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4475101Z 2025-09-07T08:03:22.4475174Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4475371Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4475590Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4475917Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4476301Z return mod(**inputs) 2025-09-07T08:03:22.4476666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4477044Z outputs = self.deberta( 2025-09-07T08:03:22.4477400Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4477778Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4478155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4478550Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4478897Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4479230Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4479618Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4480022Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4480416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4480797Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4481209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4481697Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4482216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4482684Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4482854Z 2025-09-07T08:03:22.4482972Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4483308Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4483611Z return mod(**inputs) 2025-09-07T08:03:22.4483971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4484352Z outputs = self.deberta( 2025-09-07T08:03:22.4484711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4485082Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4485455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4485843Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4486182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4486513Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4486889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4487279Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4487670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4488044Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4488410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4488781Z context_layer = torch.bmm( 2025-09-07T08:03:22.4488896Z 2025-09-07T08:03:22.4488995Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4489400Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4489753Z return mod(**inputs) 2025-09-07T08:03:22.4490103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4490477Z outputs = self.deberta( 2025-09-07T08:03:22.4490835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4491207Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4491571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4491958Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4492298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4492635Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4493015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4493401Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4493797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4494173Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4494550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4495029Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4495252Z 2025-09-07T08:03:22.4495326Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4495528Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4495724Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4495923Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4496111Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4496305Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4496497Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4497104Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4497301Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4497488Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4497710Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4498047Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4498349Z return mod(**inputs) 2025-09-07T08:03:22.4498696Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4499071Z outputs = self.deberta( 2025-09-07T08:03:22.4499427Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4499804Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4500172Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4500548Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4500891Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4501223Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4501598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4501982Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4502403Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4502811Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4503185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4503667Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4504170Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4504628Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4504804Z 2025-09-07T08:03:22.4504899Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4505235Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4505537Z return mod(**inputs) 2025-09-07T08:03:22.4505889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4506263Z outputs = self.deberta( 2025-09-07T08:03:22.4506619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4506990Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4507358Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4507737Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4508077Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4508411Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4508786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4509184Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4509568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4509945Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4510317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4510817Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4511062Z 2025-09-07T08:03:22.4511163Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4511490Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4511794Z return mod(**inputs) 2025-09-07T08:03:22.4512156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4512535Z outputs = self.deberta( 2025-09-07T08:03:22.4512884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4513262Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4513633Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4514021Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4514359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4514685Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4515064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4515483Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4515905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4516282Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4516648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4517147Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4517398Z 2025-09-07T08:03:22.4517472Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4517671Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4517887Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4518218Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4518517Z return mod(**inputs) 2025-09-07T08:03:22.4518878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4519253Z outputs = self.deberta( 2025-09-07T08:03:22.4519600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4519973Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4520340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4520726Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4521067Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4521391Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4521770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4522163Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4522554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4522923Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4523294Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4523773Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4524287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4524751Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4524921Z 2025-09-07T08:03:22.4525027Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4525359Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4525662Z return mod(**inputs) 2025-09-07T08:03:22.4526021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4526397Z outputs = self.deberta( 2025-09-07T08:03:22.4526745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4527120Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4527491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4527879Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4528219Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4528610Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4529002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4529401Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4529797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4530178Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4530551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4530921Z context_layer = torch.bmm( 2025-09-07T08:03:22.4531039Z 2025-09-07T08:03:22.4531142Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4531481Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4531780Z return mod(**inputs) 2025-09-07T08:03:22.4532144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4532518Z outputs = self.deberta( 2025-09-07T08:03:22.4532879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4533252Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4533616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4534004Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4534348Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4534686Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4535068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4535459Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4535850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4536228Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4536607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4537081Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4537315Z 2025-09-07T08:03:22.4537390Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4537593Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4537789Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4537981Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4538164Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4538358Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4538551Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4538744Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4538926Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4539118Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4539340Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4539681Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4539974Z return mod(**inputs) 2025-09-07T08:03:22.4540333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4540711Z outputs = self.deberta( 2025-09-07T08:03:22.4541113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4541515Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4541879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4542273Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4542616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4542957Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4543324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4543722Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4544114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4544491Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4544875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4545347Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4545862Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4546325Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4546502Z 2025-09-07T08:03:22.4546601Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4546939Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4547234Z return mod(**inputs) 2025-09-07T08:03:22.4547595Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4547971Z outputs = self.deberta( 2025-09-07T08:03:22.4548329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4548706Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4549066Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4549455Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4549794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4550127Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4550502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4550882Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4551271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4551649Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4552020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4552519Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4552765Z 2025-09-07T08:03:22.4552861Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4553194Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4553495Z return mod(**inputs) 2025-09-07T08:03:22.4553850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4554217Z outputs = self.deberta( 2025-09-07T08:03:22.4554635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4555006Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4555377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4555766Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4556095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4556424Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4556804Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4557203Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4557593Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4557967Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4558344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4558850Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4559094Z 2025-09-07T08:03:22.4559177Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4559377Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4559590Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4559928Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4560231Z return mod(**inputs) 2025-09-07T08:03:22.4560594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4560973Z outputs = self.deberta( 2025-09-07T08:03:22.4561334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4561710Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4562080Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4562468Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4562800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4563135Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4563514Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4563910Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4564298Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4564681Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4565060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4565544Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4565833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4565965Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4565968Z 2025-09-07T08:03:22.4566067Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4566295Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4566385Z return mod(**inputs) 2025-09-07T08:03:22.4566657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4566730Z outputs = self.deberta( 2025-09-07T08:03:22.4566988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4567066Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4567320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4567399Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4567619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4567693Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4567960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4568051Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4568312Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4568383Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4568643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4568718Z context_layer = torch.bmm( 2025-09-07T08:03:22.4568721Z 2025-09-07T08:03:22.4568819Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4569013Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4569074Z return mod(**inputs) 2025-09-07T08:03:22.4569334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4569411Z outputs = self.deberta( 2025-09-07T08:03:22.4569666Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4569739Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4569995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4570083Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4570291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4570364Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4570627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4570720Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4570982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4571052Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4571303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4571491Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4571494Z 2025-09-07T08:03:22.4571568Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4571648Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4571719Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4571797Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4571866Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4571963Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4572082Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4572152Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4572221Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4572298Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4572395Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T08:03:22.4572590Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4572649Z     return mod(**inputs)
2025-09-07T08:03:22.4572909Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4572981Z     outputs = self.deberta(
2025-09-07T08:03:22.4573238Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4573313Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4573568Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4573660Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4573868Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4573942Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4574202Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4574288Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4574548Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4574618Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4574872Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward
2025-09-07T08:03:22.4575060Z     query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads)
2025-09-07T08:03:22.4575352Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4575485Z     return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4575488Z 
2025-09-07T08:03:22.4575584Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4575781Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4575842Z     return mod(**inputs)
2025-09-07T08:03:22.4576102Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4576175Z     outputs = self.deberta(
2025-09-07T08:03:22.4576429Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4576506Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4576759Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4576844Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4577050Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4577123Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4577382Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4577467Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4577755Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4577855Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4578100Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-09-07T08:03:22.4578306Z     attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-09-07T08:03:22.4578309Z 
2025-09-07T08:03:22.4578404Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4578595Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4578655Z     return mod(**inputs)
2025-09-07T08:03:22.4578916Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4578981Z     outputs = self.deberta(
2025-09-07T08:03:22.4579233Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4579309Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4579556Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4579644Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4579844Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4579918Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4580171Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4580257Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4580510Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4580586Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4580839Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
2025-09-07T08:03:22.4581072Z     attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
2025-09-07T08:03:22.4581075Z 
2025-09-07T08:03:22.4581148Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4581227Z cudagraph partition due to non gpu ops
2025-09-07T08:03:22.4581323Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:03:22.4581512Z   File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
2025-09-07T08:03:22.4581571Z     return mod(**inputs)
2025-09-07T08:03:22.4581825Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
2025-09-07T08:03:22.4581897Z     outputs = self.deberta(
2025-09-07T08:03:22.4582154Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
2025-09-07T08:03:22.4582230Z     encoder_outputs = self.encoder(
2025-09-07T08:03:22.4582479Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
2025-09-07T08:03:22.4582557Z     output_states, attn_weights = layer_module(
2025-09-07T08:03:22.4582767Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:03:22.4582839Z     return super().__call__(*args, **kwargs)
2025-09-07T08:03:22.4583100Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
2025-09-07T08:03:22.4583186Z     attention_output, att_matrix = self.attention(
2025-09-07T08:03:22.4583499Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
2025-09-07T08:03:22.4583619Z     self_output, att_matrix = self.self(
2025-09-07T08:03:22.4583866Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
2025-09-07T08:03:22.4584052Z     value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
2025-09-07T08:03:22.4584334Z   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
2025-09-07T08:03:22.4584464Z     return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
2025-09-07T08:03:22.4584467Z 
2025-09-07T08:03:22.4584565Z cudagraph partition due to non gpu ops.
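The partition points reported above all fall inside DebertaV2 self-attention: transpose_for_scores at modeling_deberta_v2.py line 194 (reached from the query and value projections at lines 236 and 238), the scaled torch.bmm that builds the attention scores at line 248, and the final torch.bmm and view of the context layer at lines 268 and 272. The sketch below is a minimal, self-contained rendering of just those tensor ops, with hypothetical shapes and scaling; it illustrates the calls the tracebacks name and is not the Hugging Face implementation.

# Minimal sketch of the DebertaV2 attention ops named in the tracebacks above.
# Shapes and the scaling rule are illustrative assumptions, not the HF code.
import torch

batch, seq_len, num_heads, head_size = 2, 128, 12, 64

def transpose_for_scores(x: torch.Tensor, heads: int) -> torch.Tensor:
    # mirrors the reshape/permute pattern at modeling_deberta_v2.py:194
    x = x.view(x.size()[:-1] + (heads, -1))
    return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))

hidden = torch.randn(batch, seq_len, num_heads * head_size)
query_layer = transpose_for_scores(hidden, num_heads)   # cf. line 236
key_layer = transpose_for_scores(hidden, num_heads)
value_layer = transpose_for_scores(hidden, num_heads)   # cf. line 238

scale = torch.sqrt(torch.tensor(query_layer.size(-1), dtype=torch.float))
attention_scores = torch.bmm(                            # cf. line 248
    query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)
)
attention_probs = torch.softmax(attention_scores, dim=-1)
context_layer = torch.bmm(attention_probs, value_layer)  # cf. line 268
context_layer = context_layer.view(                      # cf. line 272
    -1, num_heads, context_layer.size(-2), context_layer.size(-1)
)
print(context_layer.shape)  # torch.Size([2, 12, 128, 64])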
Found from : 2025-09-07T08:03:22.4584756Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4584817Z return mod(**inputs) 2025-09-07T08:03:22.4585073Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4585148Z outputs = self.deberta( 2025-09-07T08:03:22.4585397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4585469Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4585717Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4585798Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4586018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4586092Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4586350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4586438Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4586695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4586767Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4587012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4587088Z context_layer = torch.bmm( 2025-09-07T08:03:22.4587091Z 2025-09-07T08:03:22.4587187Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4587377Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4587439Z return mod(**inputs) 2025-09-07T08:03:22.4587689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4587765Z outputs = self.deberta( 2025-09-07T08:03:22.4588012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4588087Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4588335Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4588414Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4588625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4588698Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4588951Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4589038Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4589319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4589419Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4589668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4589849Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4589853Z 2025-09-07T08:03:22.4589925Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590004Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590075Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590144Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590223Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590291Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590367Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590443Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590510Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590590Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4590684Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4590872Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4590931Z return mod(**inputs) 2025-09-07T08:03:22.4591182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4591253Z outputs = self.deberta( 2025-09-07T08:03:22.4591499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4591573Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4591822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4591902Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4592113Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4592185Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4592438Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4592525Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4592778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4592847Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4593090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4593276Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4593561Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4593687Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4593690Z 2025-09-07T08:03:22.4593786Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4593975Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4594035Z return mod(**inputs) 2025-09-07T08:03:22.4594286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4594357Z outputs = self.deberta( 2025-09-07T08:03:22.4594635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4594751Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4594997Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4595074Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4595286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4595359Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4595613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4595699Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4595953Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4596026Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4596273Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4596471Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4596475Z 2025-09-07T08:03:22.4596569Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4596762Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4596821Z return mod(**inputs) 2025-09-07T08:03:22.4597072Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4597143Z outputs = self.deberta( 2025-09-07T08:03:22.4597387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4597459Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4597711Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4597798Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4597998Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4598071Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4598324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4598410Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4598664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4598734Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4598982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4599183Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4599186Z 2025-09-07T08:03:22.4599259Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4599338Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4599431Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4599623Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4599682Z return mod(**inputs) 2025-09-07T08:03:22.4599931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4600003Z outputs = self.deberta( 2025-09-07T08:03:22.4600275Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4600380Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4600626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4600703Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4600912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4600986Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4601244Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4601329Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4601575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4601656Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4601904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4602090Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4602373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4602500Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4602503Z 2025-09-07T08:03:22.4602598Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4602782Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4602851Z return mod(**inputs) 2025-09-07T08:03:22.4603103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4603179Z outputs = self.deberta( 2025-09-07T08:03:22.4603428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4603503Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4603749Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4603830Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4604040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4604114Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4604368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4604459Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4604709Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4604787Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4605039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4605113Z context_layer = torch.bmm( 2025-09-07T08:03:22.4605116Z 2025-09-07T08:03:22.4605211Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4605400Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4605462Z return mod(**inputs) 2025-09-07T08:03:22.4605710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4605779Z outputs = self.deberta( 2025-09-07T08:03:22.4606054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4606155Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4606405Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4606482Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4606694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4606766Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4607020Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4607104Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4607353Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4607433Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4607680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4607860Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4607863Z 2025-09-07T08:03:22.4607935Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608015Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608085Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608154Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608232Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608301Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608378Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608448Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608519Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608600Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4608697Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4608885Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4608945Z return mod(**inputs) 2025-09-07T08:03:22.4609196Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4609269Z outputs = self.deberta( 2025-09-07T08:03:22.4609520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4609592Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4609837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4609919Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4610133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4610207Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4610463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4610547Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4610791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4610870Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4611115Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4611296Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4611636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4611764Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4611767Z 2025-09-07T08:03:22.4611861Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4612042Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4612111Z return mod(**inputs) 2025-09-07T08:03:22.4612363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4612431Z outputs = self.deberta( 2025-09-07T08:03:22.4612682Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4612761Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4613012Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4613092Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4613300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4613373Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4613627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4613711Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4613959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4614038Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4614290Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4614493Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4614497Z 2025-09-07T08:03:22.4614594Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4614787Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4614849Z return mod(**inputs) 2025-09-07T08:03:22.4615098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4615170Z outputs = self.deberta( 2025-09-07T08:03:22.4615417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4615492Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4615741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4615822Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4616032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4616107Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4616360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4616445Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4616701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4616771Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4617064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4617313Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4617316Z 2025-09-07T08:03:22.4617400Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4617479Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4617575Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4617757Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4617824Z return mod(**inputs) 2025-09-07T08:03:22.4618075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4618145Z outputs = self.deberta( 2025-09-07T08:03:22.4618393Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4618470Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4618721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4618800Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4619013Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4619086Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4619343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4619429Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4619675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4619755Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4620004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4620191Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4620474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4620602Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4620605Z 2025-09-07T08:03:22.4620701Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4620883Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4620954Z return mod(**inputs) 2025-09-07T08:03:22.4621206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4621277Z outputs = self.deberta( 2025-09-07T08:03:22.4621529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4621597Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4621851Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4621931Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4622145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4622218Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4622474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4622560Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4622833Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4622939Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4623190Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4623265Z context_layer = torch.bmm( 2025-09-07T08:03:22.4623268Z 2025-09-07T08:03:22.4623366Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4623554Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4623625Z return mod(**inputs) 2025-09-07T08:03:22.4623884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4623958Z outputs = self.deberta( 2025-09-07T08:03:22.4624211Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4624291Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4624545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4624626Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4624838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4624913Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4625171Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4625259Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4625509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4625593Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4625850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4626032Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4626035Z 2025-09-07T08:03:22.4626112Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626194Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626265Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626337Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626417Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626489Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626558Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626638Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626710Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626788Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4626893Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4627080Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4627152Z return mod(**inputs) 2025-09-07T08:03:22.4627413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4627487Z outputs = self.deberta( 2025-09-07T08:03:22.4627740Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4627817Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4628069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4628153Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4628396Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4628496Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4628747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4628832Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4629076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4629154Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4629399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4629578Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4629861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4629993Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4629996Z 2025-09-07T08:03:22.4630091Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4630271Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4630341Z return mod(**inputs) 2025-09-07T08:03:22.4630599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4630674Z outputs = self.deberta( 2025-09-07T08:03:22.4630922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4630989Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4631245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4631326Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4631537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4631611Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4631865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4631950Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4632194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4632275Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4632522Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4632727Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4632730Z 2025-09-07T08:03:22.4632828Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4633018Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4633079Z return mod(**inputs) 2025-09-07T08:03:22.4633331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4633404Z outputs = self.deberta( 2025-09-07T08:03:22.4633648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4633720Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4634001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4634108Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4634319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4634394Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4634651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4634735Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4634981Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4635062Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4635306Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4635505Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4635510Z 2025-09-07T08:03:22.4635581Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4635658Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4635752Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4635933Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4636002Z return mod(**inputs) 2025-09-07T08:03:22.4636251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4636324Z outputs = self.deberta( 2025-09-07T08:03:22.4636571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4636636Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4636895Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4636977Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4637187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4637258Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4637510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4637594Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4637838Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4637915Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4638162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4638349Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4638631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4638750Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4638762Z 2025-09-07T08:03:22.4638856Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4639039Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4639109Z return mod(**inputs) 2025-09-07T08:03:22.4639360Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4639430Z outputs = self.deberta( 2025-09-07T08:03:22.4639718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4639813Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4640074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4640154Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4640365Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4640437Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4640687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4640780Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4641029Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4641111Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4641361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4641432Z context_layer = torch.bmm( 2025-09-07T08:03:22.4641435Z 2025-09-07T08:03:22.4641529Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4641712Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4641780Z return mod(**inputs) 2025-09-07T08:03:22.4642030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4642102Z outputs = self.deberta( 2025-09-07T08:03:22.4642350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4642418Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4642685Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4642764Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4642977Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4643047Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4670495Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4670712Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4671042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4671126Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4671411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4671609Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4671615Z 2025-09-07T08:03:22.4671700Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4671789Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4671862Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4671933Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672011Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672080Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672156Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672227Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672298Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672377Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4672485Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4672849Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4672918Z return mod(**inputs) 2025-09-07T08:03:22.4673188Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4673268Z outputs = self.deberta( 2025-09-07T08:03:22.4673529Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4673611Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4673866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4673957Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4674179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4674265Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4674530Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4674625Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4674885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4674962Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4675218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4675412Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4675701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4675848Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4675853Z 2025-09-07T08:03:22.4675960Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4676162Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4676227Z return mod(**inputs) 2025-09-07T08:03:22.4676484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4676560Z outputs = self.deberta( 2025-09-07T08:03:22.4676812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4676889Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4677148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4677243Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4677455Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4677538Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4677787Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4677878Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4678137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4678209Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4678468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4679349Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4679385Z 2025-09-07T08:03:22.4679495Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4679686Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4679749Z return mod(**inputs) 2025-09-07T08:03:22.4680010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4680075Z outputs = self.deberta( 2025-09-07T08:03:22.4680330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4680398Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4680646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4680737Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4680991Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4681075Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4681323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4681422Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4681668Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4681743Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4682000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4682196Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4682202Z 2025-09-07T08:03:22.4682290Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4682363Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4682461Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4682656Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4682719Z return mod(**inputs) 2025-09-07T08:03:22.4682982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4683048Z outputs = self.deberta( 2025-09-07T08:03:22.4683308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4683378Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4683628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4683723Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4683927Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4684012Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4684262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4684350Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4684606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4684677Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4684936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4685187Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4685545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4685670Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4685674Z 2025-09-07T08:03:22.4685773Z cudagraph partition due to non gpu ops. 
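This stack bottoms out in transpose_for_scores (line 194), reached from the value projection at line 238 (the query projection at line 236 goes through the same helper in later stacks). A sketch of that helper is below; only the quoted return line comes from the log, and the head-splitting `new_shape` step before it is an assumption.

    import torch

    def transpose_for_scores_sketch(x, attention_heads):
        # Assumed: split the last (hidden) dimension into heads.
        new_shape = x.size()[:-1] + (attention_heads, -1)
        x = x.view(new_shape)  # (batch, seq, heads, head_dim)
        # Quoted from the stack above (modeling_deberta_v2.py:194):
        # fold batch and heads together for the batched matmuls.
        return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))

With `x` of shape (batch, seq, hidden), this returns (batch * heads, seq, hidden // heads), which is the layout the bmm calls in the neighbouring stacks consume.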
Found from : 2025-09-07T08:03:22.4685970Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4686031Z return mod(**inputs) 2025-09-07T08:03:22.4686291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4686356Z outputs = self.deberta( 2025-09-07T08:03:22.4686607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4686686Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4686936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4687028Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4687235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4687322Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4687574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4687661Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4687921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4687992Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4688257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4688326Z context_layer = torch.bmm( 2025-09-07T08:03:22.4688329Z 2025-09-07T08:03:22.4688426Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4688619Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4688678Z return mod(**inputs) 2025-09-07T08:03:22.4688937Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4689000Z outputs = self.deberta( 2025-09-07T08:03:22.4689258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4689323Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4689573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4689664Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4689865Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4689948Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4690194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4690279Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4690539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4690610Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4690863Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4691098Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4691102Z 2025-09-07T08:03:22.4691185Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691255Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691325Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691405Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691474Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691554Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691624Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691692Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691767Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691836Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4691931Z cudagraph partition due to non gpu ops. 
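The two stacks above end at lines 268 and 272: the bmm that forms the context layer and the view that splits it back into heads. The bmm operands are truncated in the log, so `attention_probs` and `value_layer` below are assumed names, and the trailing permute is likewise an assumption about the surrounding code; only the line-272 view expression is quoted from the trace.

    import torch

    def context_layer_sketch(attention_probs, value_layer, num_attention_heads):
        # modeling_deberta_v2.py:268 (operands not shown in the log; assumed here).
        context_layer = torch.bmm(attention_probs, value_layer)
        # modeling_deberta_v2.py:272 as quoted above, followed by an assumed
        # permute back to (batch, seq, heads, head_dim).
        return (
            context_layer.view(-1, num_attention_heads, context_layer.size(-2), context_layer.size(-1))
            .permute(0, 2, 1, 3)
            .contiguous()
        )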
Found from : 2025-09-07T08:03:22.4692126Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4692189Z return mod(**inputs) 2025-09-07T08:03:22.4692452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4692515Z outputs = self.deberta( 2025-09-07T08:03:22.4692770Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4692837Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4693086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4693173Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4693378Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4693459Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4693710Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4693799Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4694057Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4694128Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4694382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4694559Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4694857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4694979Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4694987Z 2025-09-07T08:03:22.4695084Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4695278Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4695338Z return mod(**inputs) 2025-09-07T08:03:22.4695601Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4695664Z outputs = self.deberta( 2025-09-07T08:03:22.4695913Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4695988Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4696239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4696326Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4696560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4696717Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4696964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4697051Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4697308Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4697378Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4697635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4697837Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4697840Z 2025-09-07T08:03:22.4697949Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4698136Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4698198Z return mod(**inputs) 2025-09-07T08:03:22.4698461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4698525Z outputs = self.deberta( 2025-09-07T08:03:22.4698783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4698849Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4699099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4699188Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4699394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4699481Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4699729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4699817Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4700079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4700152Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4700412Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4700605Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4700607Z 2025-09-07T08:03:22.4700692Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4700771Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4700867Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4701060Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4701120Z return mod(**inputs) 2025-09-07T08:03:22.4701382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4701444Z outputs = self.deberta( 2025-09-07T08:03:22.4701694Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4701767Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4702016Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4702101Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4702363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4702445Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4702690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4702777Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4703028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4703098Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4703355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4703536Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4703823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4703954Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4703957Z 2025-09-07T08:03:22.4704054Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4704247Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4704307Z return mod(**inputs) 2025-09-07T08:03:22.4704567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4704629Z outputs = self.deberta( 2025-09-07T08:03:22.4704878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4704952Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4705204Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4705294Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4705496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4705569Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4705826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4705913Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4706166Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4706237Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4706491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4706561Z context_layer = torch.bmm( 2025-09-07T08:03:22.4706565Z 2025-09-07T08:03:22.4706658Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4706850Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4706909Z return mod(**inputs) 2025-09-07T08:03:22.4707165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4707226Z outputs = self.deberta( 2025-09-07T08:03:22.4707470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4707542Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4707788Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4707937Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4708142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4708224Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4708470Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4708554Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4708806Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4708877Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4709131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4709307Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4709312Z 2025-09-07T08:03:22.4709385Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709464Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709533Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709610Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709686Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709753Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709829Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709900Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4709968Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4710045Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4710140Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4710335Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4710395Z return mod(**inputs) 2025-09-07T08:03:22.4710651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4710723Z outputs = self.deberta( 2025-09-07T08:03:22.4710969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4711044Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4711291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4711376Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4711578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4711650Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4711908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4711996Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4712250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4712321Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4712567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4712753Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4713038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4713168Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4713171Z 2025-09-07T08:03:22.4713297Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4713519Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4713581Z return mod(**inputs) 2025-09-07T08:03:22.4713832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4713903Z outputs = self.deberta( 2025-09-07T08:03:22.4714150Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4714225Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4714471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4714548Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4714761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4714839Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4715093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4715179Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4715431Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4715501Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4715746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4715951Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4715954Z 2025-09-07T08:03:22.4716050Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4716244Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4716307Z return mod(**inputs) 2025-09-07T08:03:22.4716560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4716628Z outputs = self.deberta( 2025-09-07T08:03:22.4716877Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4716953Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4717199Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4717286Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4717488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4717565Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4717821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4717908Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4718162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4718233Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4718480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4718676Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4718680Z 2025-09-07T08:03:22.4718751Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4718831Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4718955Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4722547Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4722617Z return mod(**inputs) 2025-09-07T08:03:22.4722880Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4722954Z outputs = self.deberta( 2025-09-07T08:03:22.4723205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4723270Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4723524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4723605Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4723821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4723896Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4724186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4724284Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4724534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4724614Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4724860Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4725037Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4725331Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4725455Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4725461Z 2025-09-07T08:03:22.4725566Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4725754Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4725814Z return mod(**inputs) 2025-09-07T08:03:22.4726075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4726140Z outputs = self.deberta( 2025-09-07T08:03:22.4726397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4726463Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4726718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4726799Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4727003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4727086Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4727334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4727427Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4727674Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4727744Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4728002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4728105Z context_layer = torch.bmm( 2025-09-07T08:03:22.4728127Z 2025-09-07T08:03:22.4728234Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4728464Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4728532Z return mod(**inputs) 2025-09-07T08:03:22.4728782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4728845Z outputs = self.deberta( 2025-09-07T08:03:22.4729099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4729165Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4729419Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4729498Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4729704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4729789Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4730034Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4730130Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4730375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4730452Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4730700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4730880Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4730883Z 2025-09-07T08:03:22.4730970Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731043Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731123Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731193Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731260Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731338Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731406Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731480Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731551Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731619Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4731720Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4731901Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4731968Z return mod(**inputs) 2025-09-07T08:03:22.4732223Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4732288Z outputs = self.deberta( 2025-09-07T08:03:22.4732543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4732609Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4732859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4732938Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4733137Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4733216Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4733463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4733591Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4733853Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4733958Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4734207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4734383Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4734676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4734796Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4734799Z 2025-09-07T08:03:22.4734902Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4735090Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4735153Z return mod(**inputs) 2025-09-07T08:03:22.4735416Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4735479Z outputs = self.deberta( 2025-09-07T08:03:22.4735735Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4735798Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4736053Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4736132Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4736333Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4736416Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4736670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4736766Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4737011Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4737081Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4737338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4737531Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4737535Z 2025-09-07T08:03:22.4737640Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4737826Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4737898Z return mod(**inputs) 2025-09-07T08:03:22.4738153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4738215Z outputs = self.deberta( 2025-09-07T08:03:22.4738471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4738538Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4738794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4738875Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4739076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4739157Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4739454Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4739579Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4739829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4739909Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4740162Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4740352Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4740355Z 2025-09-07T08:03:22.4740435Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4740507Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4740611Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4740796Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4740860Z return mod(**inputs) 2025-09-07T08:03:22.4741122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4741184Z outputs = self.deberta( 2025-09-07T08:03:22.4741440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4741504Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4741761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4741840Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4742043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4742124Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4742376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4742471Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4742720Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4742789Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4743049Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4743231Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4743524Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4743644Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4743650Z 2025-09-07T08:03:22.4743753Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4743936Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4743995Z return mod(**inputs) 2025-09-07T08:03:22.4744255Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4744320Z outputs = self.deberta( 2025-09-07T08:03:22.4744578Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4744643Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4744893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4745008Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4745224Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4745323Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4745571Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4745663Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4745912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4745981Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4746239Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4746305Z context_layer = torch.bmm( 2025-09-07T08:03:22.4746308Z 2025-09-07T08:03:22.4746413Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4746599Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4746660Z return mod(**inputs) 2025-09-07T08:03:22.4746918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4746981Z outputs = self.deberta( 2025-09-07T08:03:22.4747237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4747300Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4747556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4747633Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4747837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4747918Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4748167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4748263Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4748513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4748583Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4748842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4749015Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4749018Z 2025-09-07T08:03:22.4749098Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749172Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749253Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749325Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749393Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749470Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749538Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749605Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749681Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749750Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4749851Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4750036Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4750097Z return mod(**inputs) 2025-09-07T08:03:22.4750359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4750450Z outputs = self.deberta( 2025-09-07T08:03:22.4750718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4750809Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4751064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4751145Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4751347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4751430Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4751676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4751769Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4752019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4752091Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4752347Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4752521Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4752814Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4752937Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4752940Z 2025-09-07T08:03:22.4753044Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4753229Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4753292Z return mod(**inputs) 2025-09-07T08:03:22.4753555Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4753620Z outputs = self.deberta( 2025-09-07T08:03:22.4753878Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4753945Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4754194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4754283Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4754486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4754569Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4754821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4754915Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4755164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4755232Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4755488Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4755678Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4755681Z 2025-09-07T08:03:22.4755783Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4755967Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4756034Z return mod(**inputs) 2025-09-07T08:03:22.4756319Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4756414Z outputs = self.deberta( 2025-09-07T08:03:22.4756671Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4756737Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4756996Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4757076Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4757282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4757360Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4757609Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4757710Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4757959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4758027Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4758285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4758475Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4758478Z 2025-09-07T08:03:22.4758556Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4758627Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4758729Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4758913Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4758978Z return mod(**inputs) 2025-09-07T08:03:22.4759240Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4759303Z outputs = self.deberta( 2025-09-07T08:03:22.4759559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4759623Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4759874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4759960Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4760164Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4760245Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4760496Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4760590Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4760837Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4760906Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4761161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4761337Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4761630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4761761Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4761765Z 2025-09-07T08:03:22.4761902Z cudagraph partition due to non gpu ops. 
Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward
    context_layer = torch.bmm(
cudagraph partition due to non gpu ops.

Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward
    context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1))
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.

Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward
    query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
    return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
cudagraph partition due to non gpu ops.

Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
    attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
cudagraph partition due to non gpu ops.

Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward
    attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype))
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops
cudagraph partition due to non gpu ops.

Found from :
  File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass
    return mod(**inputs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward
    outputs = self.deberta(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward
    encoder_outputs = self.encoder(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward
    output_states, attn_weights = layer_module(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward
    attention_output, att_matrix = self.attention(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward
    self_output, att_matrix = self.self(
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward
    value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads)
  File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores
    return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1))
cudagraph partition due to non gpu ops.
Found from : 2025-09-07T08:03:22.4854942Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4855014Z return mod(**inputs) 2025-09-07T08:03:22.4855268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4855341Z outputs = self.deberta( 2025-09-07T08:03:22.4855594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4855663Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4855922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4856004Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4856218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4856295Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4856550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4856637Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4856887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4856969Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4857268Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4857466Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4857469Z 2025-09-07T08:03:22.4857541Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857611Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857686Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857756Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857832Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857900Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4857968Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4858045Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4858115Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4858188Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4858286Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4858470Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4858535Z return mod(**inputs) 2025-09-07T08:03:22.4858785Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4858855Z outputs = self.deberta( 2025-09-07T08:03:22.4859099Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4859163Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4859419Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4859499Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4859707Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4859784Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4860039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4860125Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4860371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4860448Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4860693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 236, in forward 2025-09-07T08:03:22.4860872Z query_layer = self.transpose_for_scores(self.query_proj(query_states), self.num_attention_heads) 2025-09-07T08:03:22.4861154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4861282Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4861285Z 2025-09-07T08:03:22.4861381Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4861563Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4861631Z return mod(**inputs) 2025-09-07T08:03:22.4861881Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4861948Z outputs = self.deberta( 2025-09-07T08:03:22.4862193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4862258Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4862541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4862649Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4862859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4862931Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4863178Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4863270Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4863518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4863595Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4863843Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4864043Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4864048Z 2025-09-07T08:03:22.4864143Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4864327Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4864394Z return mod(**inputs) 2025-09-07T08:03:22.4864652Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4864721Z outputs = self.deberta( 2025-09-07T08:03:22.4864974Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4865045Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4865295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4865376Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4865588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4865662Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4865917Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4866002Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4866248Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4866323Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4866576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 248, in forward 2025-09-07T08:03:22.4866774Z attention_scores = torch.bmm(query_layer, key_layer.transpose(-1, -2) / scale.to(dtype=query_layer.dtype)) 2025-09-07T08:03:22.4866780Z 2025-09-07T08:03:22.4866850Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4866926Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4867018Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4867200Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4867265Z return mod(**inputs) 2025-09-07T08:03:22.4867513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4867586Z outputs = self.deberta( 2025-09-07T08:03:22.4867831Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4867895Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4868177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4868283Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4868489Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4868560Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4868808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4868902Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4869148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4869228Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4869473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 238, in forward 2025-09-07T08:03:22.4869661Z value_layer = self.transpose_for_scores(self.value_proj(hidden_states), self.num_attention_heads) 2025-09-07T08:03:22.4869942Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 194, in transpose_for_scores 2025-09-07T08:03:22.4870061Z return x.permute(0, 2, 1, 3).contiguous().view(-1, x.size(1), x.size(-1)) 2025-09-07T08:03:22.4870064Z 2025-09-07T08:03:22.4870164Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4870346Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4870412Z return mod(**inputs) 2025-09-07T08:03:22.4870664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4870735Z outputs = self.deberta( 2025-09-07T08:03:22.4870982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4871051Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4871302Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4871380Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4871590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4871661Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4871906Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4872002Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4872247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4872327Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4872576Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 268, in forward 2025-09-07T08:03:22.4872639Z context_layer = torch.bmm( 2025-09-07T08:03:22.4872646Z 2025-09-07T08:03:22.4872738Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4872915Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4872980Z return mod(**inputs) 2025-09-07T08:03:22.4873227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1231, in forward 2025-09-07T08:03:22.4873295Z outputs = self.deberta( 2025-09-07T08:03:22.4873538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 786, in forward 2025-09-07T08:03:22.4873601Z encoder_outputs = self.encoder( 2025-09-07T08:03:22.4873896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 659, in forward 2025-09-07T08:03:22.4873989Z output_states, attn_weights = layer_module( 2025-09-07T08:03:22.4874197Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:22.4874269Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:22.4874519Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 438, in forward 2025-09-07T08:03:22.4874611Z attention_output, att_matrix = self.attention( 2025-09-07T08:03:22.4874857Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 371, in forward 2025-09-07T08:03:22.4874934Z self_output, att_matrix = self.self( 2025-09-07T08:03:22.4875185Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 272, in forward 2025-09-07T08:03:22.4875371Z context_layer.view(-1, self.num_attention_heads, context_layer.size(-2), context_layer.size(-1)) 2025-09-07T08:03:22.4875374Z 2025-09-07T08:03:22.4875448Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875518Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875597Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875667Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875743Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875813Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875880Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4875956Z cudagraph partition due to non gpu ops 2025-09-07T08:03:22.4876050Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:03:22.4876237Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4876299Z return mod(**inputs) 2025-09-07T08:03:22.4876554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1262, in forward 2025-09-07T08:03:22.4876661Z start_loss = loss_fct(start_logits, start_positions) 2025-09-07T08:03:22.4876664Z 2025-09-07T08:03:22.4876756Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:22.4876945Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:22.4877006Z return mod(**inputs) 2025-09-07T08:03:22.4877258Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/deberta_v2/modeling_deberta_v2.py", line 1263, in forward 2025-09-07T08:03:22.4877348Z end_loss = loss_fct(end_logits, end_positions) 2025-09-07T08:03:22.4877351Z 2025-09-07T08:03:36.0961973Z pass 2025-09-07T08:03:36.0962391Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:38.8616643Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:03:38.8617572Z import pynvml # type: ignore[import] 2025-09-07T08:03:41.0933000Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:03:41.0933843Z from pkg_resources import resource_filename 2025-09-07T08:03:41.6919472Z 2025-09-07T08:03:42.3947883Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:03:42.3948226Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:03:42.3948547Z cpu eval DistilBertForMaskedLM 2025-09-07T08:03:42.4806230Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:42.5150873Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:42.5477321Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:49.0392618Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0393021Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0393267Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0393462Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0393657Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0393843Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0394039Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0394233Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0394420Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0394641Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0394840Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0395046Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0395249Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0395471Z cudagraph partition due to non gpu ops. 
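Note: the recurring "Trying to call the empty_gpu_cache for device: cpu" warnings come from the benchmark harness flushing an accelerator allocator cache between measurements; on cpu there is nothing to flush, so it only warns. A device-guarded helper along these lines is a sketch under that assumption, not the harness's actual implementation in benchmarks/dynamo/common.py:

import torch

def empty_gpu_cache(device: str) -> None:
    # Flush the caching allocator only for devices that actually have one.
    if device == "cuda":
        torch.cuda.empty_cache()
    elif device == "xpu":
        torch.xpu.empty_cache()  # assumes a PyTorch build with XPU support
    # for "cpu" there is no device cache, hence the warning in the log rather than an error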
Found from : 2025-09-07T08:03:49.0395828Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0396151Z return mod(**inputs) 2025-09-07T08:03:49.0396588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0396991Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0397392Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0397778Z return self.transformer( 2025-09-07T08:03:49.0398148Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0398532Z layer_outputs = layer_module( 2025-09-07T08:03:49.0398861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0399207Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0399594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0399970Z sa_output = self.attention( 2025-09-07T08:03:49.0400343Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0400779Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0400952Z 2025-09-07T08:03:49.0401034Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0401254Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0401449Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0401640Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0401821Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402007Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402198Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402384Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402565Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402752Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0402943Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0403132Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0403316Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0403537Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:49.0403876Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0404185Z return mod(**inputs) 2025-09-07T08:03:49.0404932Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0405360Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0405741Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0406120Z return self.transformer( 2025-09-07T08:03:49.0406490Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0406862Z layer_outputs = layer_module( 2025-09-07T08:03:49.0407194Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0407533Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0407921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0408295Z sa_output = self.attention( 2025-09-07T08:03:49.0408665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0409093Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0409261Z 2025-09-07T08:03:49.0409341Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0409532Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0409718Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0409910Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0410097Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0410286Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0410467Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0410656Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0410840Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0411032Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0411219Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0411407Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0411595Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0411811Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:49.0412137Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0412441Z return mod(**inputs) 2025-09-07T08:03:49.0412800Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0413183Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0413559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0413929Z return self.transformer( 2025-09-07T08:03:49.0414299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0414681Z layer_outputs = layer_module( 2025-09-07T08:03:49.0415004Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0415338Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0415713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0416088Z sa_output = self.attention( 2025-09-07T08:03:49.0416450Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0416876Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0417044Z 2025-09-07T08:03:49.0417114Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0417351Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0417567Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0417780Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0417965Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0418156Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0418347Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0418536Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0418725Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0418906Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0419093Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0419282Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0419469Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0419676Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:49.0420009Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0420311Z return mod(**inputs) 2025-09-07T08:03:49.0420670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0421044Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0421417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0421793Z return self.transformer( 2025-09-07T08:03:49.0422163Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0422536Z layer_outputs = layer_module( 2025-09-07T08:03:49.0422848Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0423189Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0423574Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0423953Z sa_output = self.attention( 2025-09-07T08:03:49.0424314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0424740Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0424912Z 2025-09-07T08:03:49.0424984Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0425181Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0425376Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0425558Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0425749Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0425941Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0426128Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0426310Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0426504Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0426696Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0426891Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0427076Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0427265Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0427483Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:49.0427816Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0428111Z return mod(**inputs) 2025-09-07T08:03:49.0428472Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0428849Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0429227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0429609Z return self.transformer( 2025-09-07T08:03:49.0430008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0430426Z layer_outputs = layer_module( 2025-09-07T08:03:49.0430752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0431090Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0431464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0431843Z sa_output = self.attention( 2025-09-07T08:03:49.0432212Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0432639Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0432804Z 2025-09-07T08:03:49.0432882Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0433071Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0433265Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0433454Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0433645Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0433826Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434018Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434205Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434395Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434576Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434791Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0434983Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0435175Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0435391Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:03:49.0435715Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0436016Z return mod(**inputs) 2025-09-07T08:03:49.0436385Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 826, in forward 2025-09-07T08:03:49.0436771Z dlbrt_output = self.distilbert( 2025-09-07T08:03:49.0437147Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:03:49.0437521Z return self.transformer( 2025-09-07T08:03:49.0437890Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:03:49.0438275Z layer_outputs = layer_module( 2025-09-07T08:03:49.0438605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:03:49.0438934Z return super().__call__(*args, **kwargs) 2025-09-07T08:03:49.0439323Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:03:49.0439707Z sa_output = self.attention( 2025-09-07T08:03:49.0440078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:03:49.0440509Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:03:49.0440676Z 2025-09-07T08:03:49.0440751Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0440945Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0441137Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0441327Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0441508Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0441698Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0441887Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0442076Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0442259Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0443654Z cudagraph partition due to non gpu ops 2025-09-07T08:03:49.0443912Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:03:49.0444252Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:03:49.0444556Z return mod(**inputs) 2025-09-07T08:03:49.0444914Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 843, in forward 2025-09-07T08:03:49.0445418Z mlm_loss = self.mlm_loss_fct(prediction_logits.view(-1, prediction_logits.size(-1)), labels.view(-1)) 2025-09-07T08:03:49.0445660Z 2025-09-07T08:03:56.9810676Z pass 2025-09-07T08:03:56.9811985Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:03:59.2754035Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:03:59.2754877Z import pynvml # type: ignore[import] 2025-09-07T08:04:01.4985599Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. 
See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:04:01.4986442Z from pkg_resources import resource_filename 2025-09-07T08:04:02.0479052Z 2025-09-07T08:04:02.5920068Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:04:02.5920413Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:04:02.5920727Z cpu eval DistilBertForQuestionAnswering 2025-09-07T08:04:02.6556162Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:02.6860464Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:02.7159294Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:09.1493548Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1493898Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1494108Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1494331Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1494520Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1494713Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1494949Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:04:09.1495322Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1495639Z return mod(**inputs) 2025-09-07T08:04:09.1496052Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1043, in forward 2025-09-07T08:04:09.1496557Z logits = self.qa_outputs(hidden_states) # (bs, max_query_len, 2) 2025-09-07T08:04:09.1496741Z 2025-09-07T08:04:09.1496823Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497024Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497215Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497411Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497607Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497805Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1497990Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1498215Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1498577Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1498896Z return mod(**inputs) 2025-09-07T08:04:09.1499627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1500112Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1500610Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1501010Z return self.transformer( 2025-09-07T08:04:09.1501394Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1501781Z layer_outputs = layer_module( 2025-09-07T08:04:09.1502118Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1502467Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1502866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1503254Z sa_output = self.attention( 2025-09-07T08:04:09.1503638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1504094Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1504279Z 2025-09-07T08:04:09.1504353Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1504555Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1504743Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1504940Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1505134Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1505325Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1505515Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1505709Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1505904Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1506097Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1506282Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1506478Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1506675Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1506901Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1507239Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1507552Z return mod(**inputs) 2025-09-07T08:04:09.1507931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1508335Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1508734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1509120Z return self.transformer( 2025-09-07T08:04:09.1509497Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1509886Z layer_outputs = layer_module( 2025-09-07T08:04:09.1510222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1510571Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1510962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1511354Z sa_output = self.attention( 2025-09-07T08:04:09.1511733Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1512177Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1512352Z 2025-09-07T08:04:09.1512434Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1512625Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1512822Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1513072Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1513295Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1513513Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1513707Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1513900Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1514097Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1514289Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1514483Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1514711Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1514902Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1515126Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1515460Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1515766Z return mod(**inputs) 2025-09-07T08:04:09.1516144Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1516547Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1516944Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1517324Z return self.transformer( 2025-09-07T08:04:09.1517704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1518096Z layer_outputs = layer_module( 2025-09-07T08:04:09.1518424Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1518759Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1519153Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1519543Z sa_output = self.attention( 2025-09-07T08:04:09.1519923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1520372Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1520544Z 2025-09-07T08:04:09.1520619Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1520817Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521014Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521209Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521397Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521592Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521789Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1521983Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1522169Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1522365Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1522557Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1522753Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1522943Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1523168Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1523515Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1523826Z return mod(**inputs) 2025-09-07T08:04:09.1524202Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1524595Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1524990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1525378Z return self.transformer( 2025-09-07T08:04:09.1525753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1527167Z layer_outputs = layer_module( 2025-09-07T08:04:09.1527545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1527895Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1528303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1528698Z sa_output = self.attention( 2025-09-07T08:04:09.1529071Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1529514Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1529694Z 2025-09-07T08:04:09.1529767Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1529963Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1530161Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1530355Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1530553Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1530751Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1530944Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1531130Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1531324Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1531519Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1531713Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1531904Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1532100Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1532322Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1532663Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1532964Z return mod(**inputs) 2025-09-07T08:04:09.1533350Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1533758Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1534154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1534545Z return self.transformer( 2025-09-07T08:04:09.1534918Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1535308Z layer_outputs = layer_module( 2025-09-07T08:04:09.1535643Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1535992Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1536383Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1536777Z sa_output = self.attention( 2025-09-07T08:04:09.1537161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1537608Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1537780Z 2025-09-07T08:04:09.1537860Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1538054Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1538253Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1538448Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1538642Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1538835Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1539030Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1539226Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1539419Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1539608Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1539806Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1540074Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1540289Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1540505Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1540851Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1541163Z return mod(**inputs) 2025-09-07T08:04:09.1541537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1031, in forward 2025-09-07T08:04:09.1541937Z distilbert_output = self.distilbert( 2025-09-07T08:04:09.1542329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 736, in forward 2025-09-07T08:04:09.1542721Z return self.transformer( 2025-09-07T08:04:09.1543100Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 541, in forward 2025-09-07T08:04:09.1543494Z layer_outputs = layer_module( 2025-09-07T08:04:09.1543822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:04:09.1544171Z return super().__call__(*args, **kwargs) 2025-09-07T08:04:09.1544575Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 476, in forward 2025-09-07T08:04:09.1544963Z sa_output = self.attention( 2025-09-07T08:04:09.1545344Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 402, in forward 2025-09-07T08:04:09.1545784Z attn_output = torch.nn.functional.scaled_dot_product_attention( 2025-09-07T08:04:09.1545966Z 2025-09-07T08:04:09.1546042Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1546244Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1546443Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1546636Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1546838Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1547038Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1547237Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1547433Z cudagraph partition due to non gpu ops 2025-09-07T08:04:09.1547651Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:04:09.1548000Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1548313Z return mod(**inputs) 2025-09-07T08:04:09.1548695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1061, in forward 2025-09-07T08:04:09.1549112Z start_loss = loss_fct(start_logits, start_positions) 2025-09-07T08:04:09.1549286Z 2025-09-07T08:04:09.1549387Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:04:09.1549732Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:04:09.1550042Z return mod(**inputs) 2025-09-07T08:04:09.1550414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/distilbert/modeling_distilbert.py", line 1062, in forward 2025-09-07T08:04:09.1550823Z end_loss = loss_fct(end_logits, end_positions) 2025-09-07T08:04:09.1550972Z 2025-09-07T08:04:17.1272017Z pass 2025-09-07T08:04:17.1272445Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:18.6699451Z accuracy pass_rate=86.67% 2025-09-07T08:04:18.6701130Z calls_captured gmean=0.00x mean=472.667x 2025-09-07T08:04:18.6701453Z unique_graphs gmean=0.00x mean=1.133x 2025-09-07T08:04:18.6701698Z graph_breaks gmean=0.00x mean=0.267x 2025-09-07T08:04:18.6702255Z unique_graph_breaks gmean=0.00x mean=0.067x 2025-09-07T08:04:18.6704071Z autograd_captures gmean=0.00x mean=0.000x 2025-09-07T08:04:18.6706342Z autograd_compiles gmean=0.00x mean=0.000x 2025-09-07T08:04:18.6708120Z cudagraph_skips gmean=0.00x mean=1.067x 2025-09-07T08:04:18.6708522Z compilation_latency mean=18.111 seconds 2025-09-07T08:04:19.2333776Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *cppwrapper-true* ]] 2025-09-07T08:04:19.2334335Z + TORCHINDUCTOR_CPP_WRAPPER=1 2025-09-07T08:04:19.2335214Z + taskset -c 0-93 python benchmarks/dynamo/huggingface.py --accuracy --no-translation-validation --freezing --inference --amp --backend inductor --disable-cudagraphs --device cpu --total-partitions 3 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_accuracy.csv 2025-09-07T08:04:19.9141698Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:04:19.9142599Z import pynvml # type: ignore[import] 2025-09-07T08:04:22.1433206Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:04:22.1434035Z from pkg_resources import resource_filename 2025-09-07T08:04:23.4042242Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:04:23.4043101Z import pynvml # type: ignore[import] 2025-09-07T08:04:25.6335332Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
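Note: the run launched above switches on Inductor's C++ wrapper via TORCHINDUCTOR_CPP_WRAPPER=1 and parameter freezing via --freezing, with --amp giving bfloat16 autocast on cpu. Roughly the same knobs can be set in-process through torch._inductor.config; this is a sketch only, and the config attribute names can shift between PyTorch releases:

import torch
import torch._inductor.config as inductor_config

inductor_config.cpp_wrapper = True   # what TORCHINDUCTOR_CPP_WRAPPER=1 toggles
inductor_config.freezing = True      # inference-time parameter freezing, as with --freezing

model = torch.nn.Linear(16, 16).eval()
x = torch.randn(2, 16)
with torch.no_grad(), torch.autocast("cpu", dtype=torch.bfloat16):  # mirrors --amp on cpu
    out = torch.compile(model, backend="inductor")(x)
print(out.dtype)  # torch.bfloat16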
2025-09-07T08:04:25.6336212Z from pkg_resources import resource_filename 2025-09-07T08:04:26.1587965Z 2025-09-07T08:04:27.8872991Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:04:27.8873391Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:04:27.8873739Z cpu eval AlbertForMaskedLM 2025-09-07T08:04:30.4142753Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:30.7496627Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:31.0860804Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:54.6861861Z pass 2025-09-07T08:04:54.6862334Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:04:57.1767772Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:04:57.1768677Z import pynvml # type: ignore[import] 2025-09-07T08:04:59.4094799Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:04:59.4095644Z from pkg_resources import resource_filename 2025-09-07T08:04:59.9623786Z 2025-09-07T08:05:01.6295825Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:05:01.6296589Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:05:01.6296951Z cpu eval AlbertForQuestionAnswering 2025-09-07T08:05:04.1509976Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:04.5032315Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:04.8330218Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:26.2663725Z pass 2025-09-07T08:05:26.2664147Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:28.8058981Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:05:28.8059910Z import pynvml # type: ignore[import] 2025-09-07T08:05:31.0483605Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:05:31.0484510Z from pkg_resources import resource_filename 2025-09-07T08:05:31.6340308Z 2025-09-07T08:05:33.4873412Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:05:33.4874443Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:05:33.4875189Z cpu eval AllenaiLongformerBase 2025-09-07T08:05:34.0303585Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:34.2873989Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:34.5431492Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:05:34.6903143Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] Graph break from `Tensor.item()`, consider setting: 2025-09-07T08:05:34.6903782Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] torch._dynamo.config.capture_scalar_outputs = True 2025-09-07T08:05:34.6904243Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] or: 2025-09-07T08:05:34.6904688Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] env TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1 2025-09-07T08:05:34.6905207Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] to include these operations in the captured graph. 2025-09-07T08:05:34.6905636Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] 2025-09-07T08:05:34.6906059Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] Graph break: from user code at: 2025-09-07T08:05:34.6906683Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] File "/var/lib/jenkins/workspace/benchmarks/dynamo/huggingface.py", line 533, in forward_pass 2025-09-07T08:05:34.6907265Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] return mod(**inputs) 2025-09-07T08:05:34.6907953Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1703, in forward 2025-09-07T08:05:34.6908632Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] outputs = self.longformer( 2025-09-07T08:05:34.6909645Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1600, in forward 2025-09-07T08:05:34.6910468Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] encoder_outputs = self.encoder( 2025-09-07T08:05:34.6911153Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1244, in forward 2025-09-07T08:05:34.6911881Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] is_global_attn = is_index_global_attn.flatten().any().item() 2025-09-07T08:05:34.6912341Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] 2025-09-07T08:05:34.6912686Z W0907 08:05:34.689031 11433 site-packages/torch/_dynamo/variables/tensor.py:1048] [0/0] 
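The warning block above is Dynamo explaining the graph break: Tensor.item() pulls a scalar out of the graph at is_global_attn = is_index_global_attn.flatten().any().item(), and by default compilation splits there. A minimal sketch of the toggle the warning itself recommends, using a placeholder function rather than the Longformer code:

    import torch

    # The flag suggested above; the environment variable form is TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1.
    torch._dynamo.config.capture_scalar_outputs = True

    def fn(x):
        s = x.sum().item()      # without the flag, Dynamo breaks the graph here
        return x * s

    compiled = torch.compile(fn, backend="inductor")
    print(compiled(torch.ones(3)))

With the flag set, the item() result is carried as an unbacked symbol and the function stays in one graph. Branching on that value in Python, as the Longformer code does with if is_global_attn:, can still trigger a data-dependent guard problem, which is exactly what the export run further down in this shard reports.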
2025-09-07T08:06:28.6190446Z pass 2025-09-07T08:06:28.6190914Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:06:31.5827440Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:06:31.5828499Z import pynvml # type: ignore[import] 2025-09-07T08:06:33.8121307Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:06:33.8122160Z from pkg_resources import resource_filename 2025-09-07T08:06:34.3931122Z 2025-09-07T08:06:36.7836029Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:06:36.7837903Z loading model: 0it [00:02, ?it/s] 2025-09-07T08:06:36.7839090Z cpu eval BartForCausalLM 2025-09-07T08:06:37.3869619Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:06:37.5395129Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:06:37.6886001Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:00.7277738Z pass 2025-09-07T08:07:00.7278902Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:03.2624197Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:07:03.2625115Z import pynvml # type: ignore[import] 2025-09-07T08:07:05.4879648Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:07:05.4880621Z from pkg_resources import resource_filename 2025-09-07T08:07:06.0598051Z 2025-09-07T08:07:10.4268192Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:07:10.4268516Z loading model: 0it [00:04, ?it/s] 2025-09-07T08:07:10.4268807Z cpu eval BartForConditionalGeneration 2025-09-07T08:07:11.6571801Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:11.9753503Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:12.2912195Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:54.1991541Z pass 2025-09-07T08:07:54.1992428Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:07:57.0928204Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. 
If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:07:57.0929115Z import pynvml # type: ignore[import] 2025-09-07T08:07:59.3229953Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:07:59.3230827Z from pkg_resources import resource_filename 2025-09-07T08:07:59.9446279Z 2025-09-07T08:08:01.0428522Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:08:01.0428881Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:08:01.0429194Z cpu eval BertForMaskedLM 2025-09-07T08:08:01.2677524Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:01.3492988Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:01.4251140Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:23.0428705Z pass 2025-09-07T08:08:23.0430089Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:25.4635294Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:08:25.4636228Z import pynvml # type: ignore[import] 2025-09-07T08:08:27.6943751Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:08:27.6944618Z from pkg_resources import resource_filename 2025-09-07T08:08:28.2730440Z 2025-09-07T08:08:29.2044705Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:08:29.2045065Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:08:29.2045372Z cpu eval BertForQuestionAnswering 2025-09-07T08:08:29.3835962Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:29.4537687Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:29.5240479Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:50.9586341Z pass 2025-09-07T08:08:50.9586797Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:08:53.3443798Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:08:53.3444607Z import pynvml # type: ignore[import] 2025-09-07T08:08:55.5726050Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. 
The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:08:55.5727234Z from pkg_resources import resource_filename 2025-09-07T08:08:56.1030416Z 2025-09-07T08:09:14.0888918Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:09:14.0889270Z loading model: 0it [00:17, ?it/s] 2025-09-07T08:09:14.0889563Z cpu eval BlenderbotForCausalLM 2025-09-07T08:09:14.5101688Z pass_due_to_skip 2025-09-07T08:09:14.5102048Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:16.4536535Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:09:16.4537320Z import pynvml # type: ignore[import] 2025-09-07T08:09:18.6737475Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:09:18.6738358Z from pkg_resources import resource_filename 2025-09-07T08:09:19.2375121Z 2025-09-07T08:09:19.9077460Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:09:20.0000042Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:09:20.0001374Z cpu eval BlenderbotSmallForCausalLM 2025-09-07T08:09:20.0002499Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:20.0420615Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:20.0812865Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:38.4977467Z pass 2025-09-07T08:09:38.4977912Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:40.8225869Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:09:40.8226797Z import pynvml # type: ignore[import] 2025-09-07T08:09:43.0478485Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:09:43.0479335Z from pkg_resources import resource_filename 2025-09-07T08:09:43.5816023Z 2025-09-07T08:09:44.4892543Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:09:44.4893078Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:09:44.4894117Z cpu eval BlenderbotSmallForConditionalGeneration 2025-09-07T08:09:44.6497029Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:44.7342647Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:09:44.8177427Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:15.0846447Z pass 2025-09-07T08:10:15.0846857Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:17.7132622Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:10:17.7133517Z import pynvml # type: ignore[import] 2025-09-07T08:10:19.9439257Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:10:19.9440313Z from pkg_resources import resource_filename 2025-09-07T08:10:20.4999223Z 2025-09-07T08:10:21.6622534Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:10:21.6622846Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:10:21.6623088Z cpu eval CamemBert 2025-09-07T08:10:21.8847677Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:21.9646432Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:22.0413303Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:43.7300782Z pass 2025-09-07T08:10:43.7301307Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:46.2319794Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:10:46.2320708Z import pynvml # type: ignore[import] 2025-09-07T08:10:48.4591122Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:10:48.4592132Z from pkg_resources import resource_filename 2025-09-07T08:10:49.0296232Z 2025-09-07T08:10:57.0042880Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:10:57.0043246Z loading model: 0it [00:07, ?it/s] 2025-09-07T08:10:57.0043512Z cpu eval DebertaV2ForMaskedLM 2025-09-07T08:10:57.2298419Z pass_due_to_skip 2025-09-07T08:10:57.2298864Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:10:58.9829749Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:10:58.9830650Z import pynvml # type: ignore[import] 2025-09-07T08:11:01.2165444Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:11:01.2166333Z from pkg_resources import resource_filename 2025-09-07T08:11:01.7546621Z 2025-09-07T08:11:08.5717414Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:11:08.5721011Z loading model: 0it [00:06, ?it/s] 2025-09-07T08:11:08.5723085Z cpu eval DebertaV2ForQuestionAnswering 2025-09-07T08:11:09.7462346Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:10.1504601Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:10.6132817Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:50.7758199Z pass 2025-09-07T08:11:50.7758615Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:53.7523560Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:11:53.7524764Z import pynvml # type: ignore[import] 2025-09-07T08:11:55.9809388Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:11:55.9810243Z from pkg_resources import resource_filename 2025-09-07T08:11:56.5052517Z 2025-09-07T08:11:57.2346431Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:11:57.2346817Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:11:57.2347118Z cpu eval DistilBertForMaskedLM 2025-09-07T08:11:57.3197676Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:57.3544138Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:11:57.3871108Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:13.8544595Z pass 2025-09-07T08:12:13.8545025Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:16.2271846Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:12:16.2272618Z import pynvml # type: ignore[import] 2025-09-07T08:12:18.4607656Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:12:18.4608538Z from pkg_resources import resource_filename 2025-09-07T08:12:19.0056997Z 2025-09-07T08:12:19.5507488Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:12:19.5507834Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:12:19.5508143Z cpu eval DistilBertForQuestionAnswering 2025-09-07T08:12:19.6143529Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:19.6447116Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:19.6747028Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:35.9586848Z pass 2025-09-07T08:12:35.9587274Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:37.6468871Z accuracy pass_rate=86.67% 2025-09-07T08:12:37.6470262Z calls_captured gmean=0.00x mean=472.667x 2025-09-07T08:12:37.6471334Z unique_graphs gmean=0.00x mean=1.133x 2025-09-07T08:12:37.6473653Z graph_breaks gmean=0.00x mean=0.267x 2025-09-07T08:12:37.6476041Z unique_graph_breaks gmean=0.00x mean=0.067x 2025-09-07T08:12:37.6478165Z autograd_captures gmean=0.00x mean=0.000x 2025-09-07T08:12:37.6480264Z autograd_compiles gmean=0.00x mean=0.000x 2025-09-07T08:12:37.6482393Z cudagraph_skips gmean=0.00x mean=0.000x 2025-09-07T08:12:37.6483208Z compilation_latency mean=23.085 seconds 2025-09-07T08:12:38.2222139Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *freezing_cudagraphs-true* ]] 2025-09-07T08:12:38.2223019Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *freeze_autotune_cudagraphs-true* ]] 2025-09-07T08:12:38.2224176Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *aotinductor-true* ]] 
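The accuracy summary above closes the run that was launched with TORCHINDUCTOR_CPP_WRAPPER=1 and --freezing (see the taskset command earlier in this shard). For reference, a hedged sketch of the equivalent in-process Inductor toggles, with a throwaway function standing in for the benchmark models:

    import torch
    import torch._inductor.config as inductor_config

    inductor_config.cpp_wrapper = True   # the setting TORCHINDUCTOR_CPP_WRAPPER=1 selects
    inductor_config.freezing = True      # counterpart of the benchmark's --freezing flag

    @torch.compile(backend="inductor")
    def f(x):
        return torch.nn.functional.relu(x @ x.t())

    with torch.no_grad():                # freezing is intended for inference-only graphs
        print(f(torch.randn(8, 8)))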
2025-09-07T08:12:38.2224820Z + [[ inference == \i\n\f\e\r\e\n\c\e ]] 2025-09-07T08:12:38.2225017Z + [[ accuracy == \a\c\c\u\r\a\c\y ]] 2025-09-07T08:12:38.2225854Z + taskset -c 0-93 python benchmarks/dynamo/huggingface.py --accuracy --no-translation-validation --freezing --inference --amp --export --disable-cudagraphs --device cpu --total-partitions 3 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inductor_export_huggingface_amp_inference_cpu_x86_accuracy.csv 2025-09-07T08:12:38.9028477Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:12:38.9029397Z import pynvml # type: ignore[import] 2025-09-07T08:12:41.1340331Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:12:41.1341207Z from pkg_resources import resource_filename 2025-09-07T08:12:42.3646182Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:12:42.3647032Z import pynvml # type: ignore[import] 2025-09-07T08:12:44.5878766Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:12:44.5879635Z from pkg_resources import resource_filename 2025-09-07T08:12:45.1235304Z 2025-09-07T08:12:46.8350179Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:12:46.8350507Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:12:46.8350758Z cpu eval AlbertForMaskedLM 2025-09-07T08:12:49.3713079Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:49.7045274Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:50.0374591Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:12:59.5395357Z pass 2025-09-07T08:12:59.5395783Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:01.1734677Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:13:01.1735677Z import pynvml # type: ignore[import] 2025-09-07T08:13:03.4035434Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. 
Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:13:03.4036283Z from pkg_resources import resource_filename 2025-09-07T08:13:03.9574557Z 2025-09-07T08:13:05.6524099Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:13:05.6524419Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:13:05.6524675Z cpu eval AlbertForQuestionAnswering 2025-09-07T08:13:08.1703409Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:08.5017304Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:08.8308904Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:18.2964199Z pass 2025-09-07T08:13:18.2964711Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:19.9032271Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:13:19.9033081Z import pynvml # type: ignore[import] 2025-09-07T08:13:22.1336171Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:13:22.1337047Z from pkg_resources import resource_filename 2025-09-07T08:13:22.6755455Z 2025-09-07T08:13:24.4756696Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:13:24.4757001Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:13:24.4757285Z cpu eval AllenaiLongformerBase 2025-09-07T08:13:25.0020069Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:25.2560962Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:25.5065196Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:26.1686415Z ERROR:common: 2025-09-07T08:13:26.1686643Z Traceback (most recent call last): 2025-09-07T08:13:26.1687033Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 2320, in check_accuracy 2025-09-07T08:13:26.1687383Z optimized_model_iter_fn = optimize_ctx( 2025-09-07T08:13:26.1689530Z File "/var/lib/jenkins/workspace/benchmarks/dynamo/common.py", line 1523, in export 2025-09-07T08:13:26.1689838Z ep = torch.export.export( 2025-09-07T08:13:26.1690176Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/__init__.py", line 311, in export 2025-09-07T08:13:26.1690476Z raise e 2025-09-07T08:13:26.1690746Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/__init__.py", line 277, in export 2025-09-07T08:13:26.1691046Z return _export( 2025-09-07T08:13:26.1691332Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 1163, in wrapper 2025-09-07T08:13:26.1691617Z raise e 2025-09-07T08:13:26.1691900Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 1129, in wrapper 2025-09-07T08:13:26.1692220Z ep = fn(*args, **kwargs) 2025-09-07T08:13:26.1692548Z File 
"/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/exported_program.py", line 124, in wrapper 2025-09-07T08:13:26.1692890Z return fn(*args, **kwargs) 2025-09-07T08:13:26.1693175Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 2255, in _export 2025-09-07T08:13:26.1693479Z ep = _export_for_training( 2025-09-07T08:13:26.1693799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 1163, in wrapper 2025-09-07T08:13:26.1694105Z raise e 2025-09-07T08:13:26.1694375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 1129, in wrapper 2025-09-07T08:13:26.1694679Z ep = fn(*args, **kwargs) 2025-09-07T08:13:26.1695005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/exported_program.py", line 124, in wrapper 2025-09-07T08:13:26.1695341Z return fn(*args, **kwargs) 2025-09-07T08:13:26.1695963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 2071, in _export_for_training 2025-09-07T08:13:26.1698511Z export_artifact = export_func( 2025-09-07T08:13:26.1698841Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 1415, in _strict_export 2025-09-07T08:13:26.1699184Z gm_torch_level = _export_to_torch_ir( 2025-09-07T08:13:26.1699531Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/export/_trace.py", line 812, in _export_to_torch_ir 2025-09-07T08:13:26.1699887Z gm_torch_level, _ = torch._dynamo.export( 2025-09-07T08:13:26.1700243Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/eval_frame.py", line 2002, in inner 2025-09-07T08:13:26.1700563Z result_traced = opt_f(*args, **kwargs) 2025-09-07T08:13:26.1700893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/eval_frame.py", line 414, in __call__ 2025-09-07T08:13:26.1701224Z return super().__call__(*args, **kwargs) 2025-09-07T08:13:26.1701592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1775, in _wrapped_call_impl 2025-09-07T08:13:26.1701949Z return self._call_impl(*args, **kwargs) 2025-09-07T08:13:26.1702274Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1702605Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1702948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/_dynamo/eval_frame.py", line 841, in compile_wrapper 2025-09-07T08:13:26.1703340Z raise e.with_traceback(None) from e.__cause__ # User compiler error 2025-09-07T08:13:26.1703864Z torch._dynamo.exc.UserError: Consider annotating your code using torch._check*(). Could not guard on data-dependent expression Eq(u0, 1) (unhinted: Eq(u0, 1)). 
(Size-like symbols: none) 2025-09-07T08:13:26.1704269Z 2025-09-07T08:13:26.1704772Z consider using data-dependent friendly APIs such as guard_or_false, guard_or_true and statically_known_trueCaused by: if is_global_attn: # transformers/models/longformer/modeling_longformer.py:554 in forward (_dynamo/variables/tensor.py:1435 in evaluate_expr) 2025-09-07T08:13:26.1705438Z For more information, run with TORCH_LOGS="dynamic" 2025-09-07T08:13:26.1705762Z For extended logs when we create symbols, also add TORCHDYNAMO_EXTENDED_DEBUG_CREATE_SYMBOL="u0" 2025-09-07T08:13:26.1706142Z If you suspect the guard was triggered from C++, add TORCHDYNAMO_EXTENDED_DEBUG_CPP=1 2025-09-07T08:13:26.1706639Z For more debugging help, see https://docs.google.com/document/d/1HSuTTVvYH1pTew89Rtpeu84Ht3nQEFTYhAX3Ypa_xJs/edit?usp=sharing 2025-09-07T08:13:26.1706951Z 2025-09-07T08:13:26.1707025Z User Stack (most recent call last): 2025-09-07T08:13:26.1707229Z (snipped, see stack below for prefix) 2025-09-07T08:13:26.1707637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1703, in forward 2025-09-07T08:13:26.1708045Z outputs = self.longformer( 2025-09-07T08:13:26.1708370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1708697Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1709090Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1600, in forward 2025-09-07T08:13:26.1709480Z encoder_outputs = self.encoder( 2025-09-07T08:13:26.1709805Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1710128Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1710504Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in forward 2025-09-07T08:13:26.1710893Z layer_outputs = layer_module( 2025-09-07T08:13:26.1711262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1711631Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1712009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:13:26.1712402Z self_attn_outputs = self.attention( 2025-09-07T08:13:26.1712732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1713056Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1713441Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:13:26.1713819Z self_outputs = self.self( 2025-09-07T08:13:26.1714133Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1714456Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1714845Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 554, in forward 2025-09-07T08:13:26.1715221Z if is_global_attn: 2025-09-07T08:13:26.1715326Z 2025-09-07T08:13:26.1715439Z For C++ stack trace, run with TORCHDYNAMO_EXTENDED_DEBUG_CPP=1 2025-09-07T08:13:26.1715869Z For more information about this error, see: 
https://pytorch.org/docs/main/generated/exportdb/index.html#constrain-as-size-example 2025-09-07T08:13:26.1716182Z 2025-09-07T08:13:26.1716241Z from user code: 2025-09-07T08:13:26.1716606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1703, in forward 2025-09-07T08:13:26.1716984Z outputs = self.longformer( 2025-09-07T08:13:26.1717301Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1717628Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1718018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1600, in forward 2025-09-07T08:13:26.1718408Z encoder_outputs = self.encoder( 2025-09-07T08:13:26.1718721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1719044Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1719429Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in forward 2025-09-07T08:13:26.1719816Z layer_outputs = layer_module( 2025-09-07T08:13:26.1720135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1720452Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1720835Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:13:26.1721229Z self_attn_outputs = self.attention( 2025-09-07T08:13:26.1721560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1721875Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1722260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:13:26.1722647Z self_outputs = self.self( 2025-09-07T08:13:26.1722962Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1786, in _call_impl 2025-09-07T08:13:26.1723284Z return forward_call(*args, **kwargs) 2025-09-07T08:13:26.1723658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 554, in forward 2025-09-07T08:13:26.1724034Z if is_global_attn: 2025-09-07T08:13:26.1724140Z 2025-09-07T08:13:26.1724502Z Set TORCHDYNAMO_VERBOSE=1 for the internal stack trace (please do this especially if you're reporting a bug to PyTorch). For even more developer context, set TORCH_LOGS="+dynamo" 2025-09-07T08:13:26.1724904Z 2025-09-07T08:13:26.1724907Z 2025-09-07T08:13:26.1725376Z The error above occurred when calling torch.export.export. If you would like to view some more information about this error, and get a list of all other errors that may occur in your export call, you can replace your `export()` call with `draft_export()`. 2025-09-07T08:13:26.1726015Z TorchDynamo optimized model failed to run because of following error 2025-09-07T08:13:26.2848013Z fail_to_run 2025-09-07T08:13:26.2848505Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:27.8231970Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. 
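The UserError above is the export-mode counterpart of the earlier graph break: strict export captures any().item() as an unbacked boolean, and the Python-level if is_global_attn: then needs a guard on a value the tracer cannot know. Below is a self-contained sketch of the same failure shape plus one data-dependent-friendly rewrite (a tensor select instead of Python control flow); it is illustrative only, not a proposed fix for the Longformer model:

    import torch

    class Branchy(torch.nn.Module):
        def forward(self, x):
            is_global = x.flatten().any().item()   # data-dependent Python bool
            if is_global:                          # export cannot guard on this
                return x * 2
            return x + 1

    # torch.export.export(Branchy(), (torch.zeros(4),))  # raises a UserError like the one above

    class Branchless(torch.nn.Module):
        def forward(self, x):
            is_global = x.flatten().any()          # keep the decision as a tensor
            return torch.where(is_global, x * 2, x + 1)

    ep = torch.export.export(Branchless(), (torch.zeros(4),))
    print(ep)

The error text also names torch._check() annotations, the guard_or_false / guard_or_true / statically_known_true helpers, and draft_export() as diagnostics; which of those fits depends on the model, so they are only mentioned here rather than demonstrated.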
Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:13:27.8232897Z import pynvml # type: ignore[import] 2025-09-07T08:13:30.0520566Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:13:30.0521522Z from pkg_resources import resource_filename 2025-09-07T08:13:30.6298058Z 2025-09-07T08:13:33.0126448Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:13:33.0127262Z loading model: 0it [00:02, ?it/s] 2025-09-07T08:13:33.0127620Z cpu eval BartForCausalLM 2025-09-07T08:13:33.6108738Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:33.7731559Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:33.9328405Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:43.3629387Z pass 2025-09-07T08:13:43.3629804Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:45.0943650Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:13:45.0944454Z import pynvml # type: ignore[import] 2025-09-07T08:13:47.3274507Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:13:47.3275390Z from pkg_resources import resource_filename 2025-09-07T08:13:48.0708273Z 2025-09-07T08:13:52.4492872Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:13:52.4493284Z loading model: 0it [00:04, ?it/s] 2025-09-07T08:13:52.4493641Z cpu eval BartForConditionalGeneration 2025-09-07T08:13:53.6839939Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:54.0035777Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:13:54.3301576Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:05.8575256Z pass 2025-09-07T08:14:05.8575703Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:07.5543758Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:14:07.5544762Z import pynvml # type: ignore[import] 2025-09-07T08:14:09.7834101Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. 
See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:14:09.7835048Z from pkg_resources import resource_filename 2025-09-07T08:14:10.3977661Z 2025-09-07T08:14:11.4928105Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:14:11.4928548Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:14:11.4929056Z cpu eval BertForMaskedLM 2025-09-07T08:14:11.7093042Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:11.7852381Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:11.8604696Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:20.4162458Z pass 2025-09-07T08:14:20.4162870Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:21.9533602Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:14:21.9534492Z import pynvml # type: ignore[import] 2025-09-07T08:14:24.1799889Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:14:24.1800758Z from pkg_resources import resource_filename 2025-09-07T08:14:24.7578130Z 2025-09-07T08:14:25.6880631Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:14:25.6881036Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:14:25.6881336Z cpu eval BertForQuestionAnswering 2025-09-07T08:14:25.8630043Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:25.9322544Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:26.0009858Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:34.4485776Z pass 2025-09-07T08:14:34.4486193Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:36.0030293Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:14:36.0031195Z import pynvml # type: ignore[import] 2025-09-07T08:14:38.2285220Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:14:38.2286068Z from pkg_resources import resource_filename 2025-09-07T08:14:38.7624761Z 2025-09-07T08:14:56.7056001Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:14:56.7056339Z loading model: 0it [00:17, ?it/s] 2025-09-07T08:14:56.7056599Z cpu eval BlenderbotForCausalLM 2025-09-07T08:14:57.1261832Z pass_due_to_skip 2025-09-07T08:14:57.1262741Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:14:59.0855594Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:14:59.0856498Z import pynvml # type: ignore[import] 2025-09-07T08:15:01.3117024Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:15:01.3117900Z from pkg_resources import resource_filename 2025-09-07T08:15:01.8579034Z 2025-09-07T08:15:02.5332554Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:02.5333284Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:02.5333672Z cpu eval BlenderbotSmallForCausalLM 2025-09-07T08:15:02.6262677Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:02.6682153Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:02.7072475Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:10.8500673Z pass 2025-09-07T08:15:10.8501204Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:12.3752539Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:15:12.3753508Z import pynvml # type: ignore[import] 2025-09-07T08:15:14.5986592Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:15:14.5987449Z from pkg_resources import resource_filename 2025-09-07T08:15:15.1448258Z 2025-09-07T08:15:16.0455225Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:16.0455655Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:16.0456078Z cpu eval BlenderbotSmallForConditionalGeneration 2025-09-07T08:15:16.2095356Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:16.2946758Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:16.3781374Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:25.4929550Z pass 2025-09-07T08:15:25.4930054Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:27.0728464Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:15:27.0729372Z import pynvml # type: ignore[import] 2025-09-07T08:15:29.2968698Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:15:29.2969873Z from pkg_resources import resource_filename 2025-09-07T08:15:29.8404170Z 2025-09-07T08:15:31.0006125Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:31.0006433Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:15:31.0006677Z cpu eval CamemBert 2025-09-07T08:15:31.2284286Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:31.3068808Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:31.3831522Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:39.9470555Z pass 2025-09-07T08:15:39.9470980Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:41.5465588Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:15:41.5466497Z import pynvml # type: ignore[import] 2025-09-07T08:15:43.7711685Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:15:43.7712516Z from pkg_resources import resource_filename 2025-09-07T08:15:44.3479213Z 2025-09-07T08:15:52.2942525Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:15:52.2942836Z loading model: 0it [00:07, ?it/s] 2025-09-07T08:15:52.2943082Z cpu eval DebertaV2ForMaskedLM 2025-09-07T08:15:52.5178818Z pass_due_to_skip 2025-09-07T08:15:52.5179170Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:15:54.1860352Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:15:54.1861260Z import pynvml # type: ignore[import] 2025-09-07T08:15:56.4156459Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:15:56.4157311Z from pkg_resources import resource_filename 2025-09-07T08:15:56.9658756Z 2025-09-07T08:16:03.7842143Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:03.7845463Z loading model: 0it [00:06, ?it/s] 2025-09-07T08:16:03.7847994Z cpu eval DebertaV2ForQuestionAnswering 2025-09-07T08:16:04.9699417Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:05.3751020Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:05.8837808Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:18.5771403Z pass 2025-09-07T08:16:18.5771920Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:20.5145739Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:16:20.5146543Z import pynvml # type: ignore[import] 2025-09-07T08:16:22.7385625Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:16:22.7386658Z from pkg_resources import resource_filename 2025-09-07T08:16:23.3206878Z 2025-09-07T08:16:24.0185118Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:24.0185470Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:24.0185759Z cpu eval DistilBertForMaskedLM 2025-09-07T08:16:24.1029082Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:24.1368977Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:24.1689474Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:31.5797629Z pass 2025-09-07T08:16:31.5798137Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:33.1067883Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:16:33.1068746Z import pynvml # type: ignore[import] 2025-09-07T08:16:35.3353044Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:16:35.3353908Z from pkg_resources import resource_filename 2025-09-07T08:16:35.8930837Z 2025-09-07T08:16:36.4373412Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:36.4373800Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:36.4374121Z cpu eval DistilBertForQuestionAnswering 2025-09-07T08:16:36.5009858Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:36.5313266Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:36.5613050Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:44.5935951Z pass 2025-09-07T08:16:44.5936365Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:45.4693937Z accuracy pass_rate=80.00% 2025-09-07T08:16:45.4695698Z calls_captured gmean=0.00x mean=351.800x 2025-09-07T08:16:45.4698284Z unique_graphs gmean=0.00x mean=0.800x 2025-09-07T08:16:45.4700554Z graph_breaks gmean=0.00x mean=0.000x 2025-09-07T08:16:45.4703092Z unique_graph_breaks gmean=0.00x mean=0.000x 2025-09-07T08:16:45.4705539Z autograd_captures gmean=0.00x mean=0.000x 2025-09-07T08:16:45.4707866Z autograd_compiles gmean=0.00x mean=0.000x 2025-09-07T08:16:45.4710154Z cudagraph_skips gmean=0.00x mean=0.000x 2025-09-07T08:16:45.4710873Z compilation_latency mean=6.492 seconds 2025-09-07T08:16:46.0324146Z + taskset -c 0-93 python benchmarks/dynamo/huggingface.py --accuracy --no-translation-validation --freezing --inference --amp --export-aot-inductor --disable-cudagraphs --device cpu --total-partitions 3 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_accuracy.csv 2025-09-07T08:16:46.7104288Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. 
Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:16:46.7105528Z import pynvml # type: ignore[import] 2025-09-07T08:16:48.9371658Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:16:48.9372834Z from pkg_resources import resource_filename 2025-09-07T08:16:50.2155542Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:16:50.2156313Z import pynvml # type: ignore[import] 2025-09-07T08:16:52.4347713Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:16:52.4348604Z from pkg_resources import resource_filename 2025-09-07T08:16:52.9731044Z 2025-09-07T08:16:54.6799528Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:16:54.6799855Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:16:54.6800114Z cpu eval AlbertForMaskedLM 2025-09-07T08:16:57.2046330Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:57.5402464Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:16:57.8735567Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:17:35.2312481Z pass 2025-09-07T08:17:35.2312914Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:17:37.8730501Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:17:37.8731509Z import pynvml # type: ignore[import] 2025-09-07T08:17:40.0970014Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:17:40.0970857Z from pkg_resources import resource_filename 2025-09-07T08:17:40.8692436Z 2025-09-07T08:17:42.5426173Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:17:42.5426502Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:17:42.5426799Z cpu eval AlbertForQuestionAnswering 2025-09-07T08:17:45.0916236Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:17:45.4197769Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:17:45.7477149Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:18.6494175Z pass 2025-09-07T08:18:18.6494592Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:21.3054546Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:18:21.3055304Z import pynvml # type: ignore[import] 2025-09-07T08:18:23.5335261Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:18:23.5336306Z from pkg_resources import resource_filename 2025-09-07T08:18:24.0706328Z 2025-09-07T08:18:26.4577442Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:18:26.4577765Z loading model: 0it [00:02, ?it/s] 2025-09-07T08:18:26.4578028Z cpu eval BartForCausalLM 2025-09-07T08:18:27.0555528Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:27.2170050Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:27.3713442Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:50.1590083Z pass 2025-09-07T08:18:50.1590551Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:18:52.3619716Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:18:52.3620491Z import pynvml # type: ignore[import] 2025-09-07T08:18:54.5885013Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
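The repeated empty_gpu_cache warnings are harmless on this CPU runner: the harness tries to drop the accelerator allocator cache between models, and only cuda and xpu have one. A rough sketch of that guard pattern (illustrative code, not the harness's own implementation):

import torch

def maybe_empty_cache(device: str) -> None:
    # Only CUDA and XPU expose an allocator cache to release; CPU is a no-op.
    if device == "cuda" and torch.cuda.is_available():
        torch.cuda.empty_cache()
    elif device == "xpu" and hasattr(torch, "xpu") and torch.xpu.is_available():
        torch.xpu.empty_cache()

maybe_empty_cache("cpu")  # nothing to do, no warning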
2025-09-07T08:18:54.5885839Z from pkg_resources import resource_filename 2025-09-07T08:18:55.1221968Z 2025-09-07T08:18:59.5131035Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:18:59.5131471Z loading model: 0it [00:04, ?it/s] 2025-09-07T08:18:59.5131838Z cpu eval BartForConditionalGeneration 2025-09-07T08:19:00.7364536Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:01.0543675Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:01.3715949Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:34.4832781Z pass 2025-09-07T08:19:34.4833210Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:37.2123455Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:19:37.2124391Z import pynvml # type: ignore[import] 2025-09-07T08:19:39.4419616Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:19:39.4420491Z from pkg_resources import resource_filename 2025-09-07T08:19:40.0020747Z 2025-09-07T08:19:41.0940623Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:19:41.0940956Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:19:41.0941208Z cpu eval BertForMaskedLM 2025-09-07T08:19:41.3134639Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:41.3908309Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:19:41.4668192Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:04.4010480Z pass 2025-09-07T08:20:04.4010884Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:06.4926845Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:20:06.4927836Z import pynvml # type: ignore[import] 2025-09-07T08:20:08.7163009Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:20:08.7163864Z from pkg_resources import resource_filename 2025-09-07T08:20:09.2503989Z 2025-09-07T08:20:10.1782699Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:20:10.1783082Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:20:10.1783361Z cpu eval BertForQuestionAnswering 2025-09-07T08:20:10.3600673Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:10.4293216Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:10.4984849Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:27.1620416Z pass 2025-09-07T08:20:27.1620842Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:29.2034289Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:20:29.2035159Z import pynvml # type: ignore[import] 2025-09-07T08:20:31.4316406Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:20:31.4317246Z from pkg_resources import resource_filename 2025-09-07T08:20:31.9677680Z 2025-09-07T08:20:49.9309060Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:20:49.9309426Z loading model: 0it [00:17, ?it/s] 2025-09-07T08:20:49.9309777Z cpu eval BlenderbotForCausalLM 2025-09-07T08:20:50.3491843Z pass_due_to_skip 2025-09-07T08:20:50.3492243Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:52.2398625Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:20:52.2399551Z import pynvml # type: ignore[import] 2025-09-07T08:20:54.4711976Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:20:54.4712835Z from pkg_resources import resource_filename 2025-09-07T08:20:55.0285732Z 2025-09-07T08:20:55.7072415Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:20:55.7072921Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:20:55.7073362Z cpu eval BlenderbotSmallForCausalLM 2025-09-07T08:20:55.8005500Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:55.8425421Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:20:55.8814015Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:10.7713737Z pass 2025-09-07T08:21:10.7714165Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:12.7313422Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:21:12.7314346Z import pynvml # type: ignore[import] 2025-09-07T08:21:14.9484492Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:21:14.9485363Z from pkg_resources import resource_filename 2025-09-07T08:21:15.5048675Z 2025-09-07T08:21:16.4078993Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:21:16.4079315Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:21:16.4079600Z cpu eval BlenderbotSmallForConditionalGeneration 2025-09-07T08:21:16.5667485Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:16.6498170Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:16.7318302Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:38.1753397Z pass 2025-09-07T08:21:38.1753858Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:40.4615697Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:21:40.4616594Z import pynvml # type: ignore[import] 2025-09-07T08:21:42.6917887Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:21:42.6918729Z from pkg_resources import resource_filename 2025-09-07T08:21:43.2308892Z 2025-09-07T08:21:44.3928751Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:21:44.3929067Z loading model: 0it [00:01, ?it/s] 2025-09-07T08:21:44.3929357Z cpu eval CamemBert 2025-09-07T08:21:44.6134704Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:44.6940093Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:21:44.7709452Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:22:01.9129963Z pass 2025-09-07T08:22:01.9130388Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:22:03.9810510Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:22:03.9811420Z import pynvml # type: ignore[import] 2025-09-07T08:22:06.2121692Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:22:06.2122682Z from pkg_resources import resource_filename 2025-09-07T08:22:06.8464587Z 2025-09-07T08:22:14.8301069Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:22:14.8301403Z loading model: 0it [00:07, ?it/s] 2025-09-07T08:22:14.8303906Z cpu eval DebertaV2ForMaskedLM 2025-09-07T08:22:15.0508491Z pass_due_to_skip 2025-09-07T08:22:15.0508869Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:22:16.7293391Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:22:16.7294449Z import pynvml # type: ignore[import] 2025-09-07T08:22:18.9589312Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
2025-09-07T08:22:18.9590171Z from pkg_resources import resource_filename 2025-09-07T08:22:19.5257943Z 2025-09-07T08:22:26.3375042Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:22:26.3377829Z loading model: 0it [00:06, ?it/s] 2025-09-07T08:22:26.3379702Z cpu eval DebertaV2ForQuestionAnswering 2025-09-07T08:22:27.5080192Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:22:27.9272874Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:22:28.4491163Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:13.1618179Z pass 2025-09-07T08:23:13.1618598Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:16.0514341Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:23:16.0515262Z import pynvml # type: ignore[import] 2025-09-07T08:23:18.2822798Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:23:18.2823684Z from pkg_resources import resource_filename 2025-09-07T08:23:18.8052253Z 2025-09-07T08:23:19.5025457Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:23:19.5025795Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:23:19.5026087Z cpu eval DistilBertForMaskedLM 2025-09-07T08:23:19.5879912Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:19.6221332Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:19.6548895Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:33.3655983Z pass 2025-09-07T08:23:33.3656393Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:35.2533963Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:23:35.2534892Z import pynvml # type: ignore[import] 2025-09-07T08:23:37.4784920Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 
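Each partition ends with a short summary (accuracy pass_rate plus gmean/mean over the compile-time counters), as printed earlier and again below. A rough sketch of how such a summary could be computed from per-model records; the records and the rule that any pass* status counts toward the pass rate are assumptions, not the harness's exact CSV schema:

from math import prod

# Hypothetical per-model results: (name, status, compilation_latency_seconds).
results = [
    ("AlbertForMaskedLM", "pass", 6.1),
    ("BartForCausalLM", "pass", 7.3),
    ("BlenderbotForCausalLM", "pass_due_to_skip", 0.0),
    ("DebertaV2ForQuestionAnswering", "pass", 9.8),
    ("SomeFailingModel", "fail_accuracy", 5.2),
]

passed = sum(1 for _, status, _ in results if status.startswith("pass"))
print(f"accuracy pass_rate={100.0 * passed / len(results):.2f}%")

latencies = [t for _, _, t in results if t > 0]
mean_latency = sum(latencies) / len(latencies)
gmean_latency = prod(latencies) ** (1.0 / len(latencies))
print(f"compilation_latency mean={mean_latency:.3f} seconds (gmean={gmean_latency:.3f})")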
2025-09-07T08:23:37.4785760Z from pkg_resources import resource_filename 2025-09-07T08:23:38.0049706Z 2025-09-07T08:23:38.5461353Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:23:38.5461713Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:23:38.5462018Z cpu eval DistilBertForQuestionAnswering 2025-09-07T08:23:38.6094244Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:38.6403800Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:38.6708710Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:51.9682491Z pass 2025-09-07T08:23:51.9682924Z WARNING:common:Trying to call the empty_gpu_cache for device: cpu, which is not in list [cuda, xpu] 2025-09-07T08:23:53.1798568Z accuracy pass_rate=85.71% 2025-09-07T08:23:53.1801848Z calls_captured gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1804082Z unique_graphs gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1806471Z graph_breaks gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1808558Z unique_graph_breaks gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1810650Z autograd_captures gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1812687Z autograd_compiles gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1815063Z cudagraph_skips gmean=0.00x mean=0.000x 2025-09-07T08:23:53.1815505Z compilation_latency mean=0.000 seconds 2025-09-07T08:23:53.7633227Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *maxautotune-true* ]] 2025-09-07T08:23:53.7634093Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *cudagraphs_low_precision-true* ]] 2025-09-07T08:23:53.7634604Z + for target in "${targets[@]}" 2025-09-07T08:23:53.7634799Z + target_flag=('--performance') 2025-09-07T08:23:53.7634985Z + local target_flag 2025-09-07T08:23:53.7635165Z + [[ performance == \p\e\r\f\o\r\m\a\n\c\e ]] 2025-09-07T08:23:53.7635385Z + target_flag+=(--cold-start-latency) 2025-09-07T08:23:53.7635869Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *freezing-true* ]] 2025-09-07T08:23:53.7636334Z + target_flag+=(--freezing) 2025-09-07T08:23:53.7636810Z + [[ training-false-inference-true-default-true-dynamic-true-cppwrapper-true-aotinductor-true-freezing-true == *default-true* ]] 2025-09-07T08:23:53.7637948Z + taskset -c 0-93 python benchmarks/dynamo/huggingface.py --performance --cold-start-latency --freezing --inference --amp --backend inductor --disable-cudagraphs --device cpu --total-partitions 3 --partition-id 0 --output /var/lib/jenkins/workspace/test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance.csv 2025-09-07T08:23:54.4460091Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:23:54.4460866Z import pynvml # type: ignore[import] 2025-09-07T08:23:56.6728382Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. 
Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:23:56.6729353Z from pkg_resources import resource_filename 2025-09-07T08:23:57.9083811Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/cuda/__init__.py:63: FutureWarning: The pynvml package is deprecated. Please install nvidia-ml-py instead. If you did not install pynvml directly, please report this to the maintainers of the package that installed pynvml for you. 2025-09-07T08:23:57.9084702Z import pynvml # type: ignore[import] 2025-09-07T08:24:00.1393630Z /opt/conda/envs/py_3.9/lib/python3.9/site-packages/librosa/util/files.py:10: UserWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html. The pkg_resources package is slated for removal as early as 2025-11-30. Refrain from using this package or pin to Setuptools<81. 2025-09-07T08:24:00.1394494Z from pkg_resources import resource_filename 2025-09-07T08:24:00.6822229Z 2025-09-07T08:24:03.0792051Z loading model: 0it [00:00, ?it/s] 2025-09-07T08:24:03.0792599Z loading model: 0it [00:02, ?it/s] 2025-09-07T08:24:03.0792940Z cpu eval AlbertForMaskedLM 2025-09-07T08:24:33.9176161Z 2025-09-07T08:24:34.9606180Z running benchmark: 0% 0/30 [00:00bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0089518Z 2025-09-07T08:41:32.0089618Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0090131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0090587Z layer_outputs = layer_module( 2025-09-07T08:41:32.0090931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0091282Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0091674Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0092082Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0092476Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0092871Z self_outputs = self.self( 2025-09-07T08:41:32.0093251Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0093661Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0094138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0094696Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0094930Z 2025-09-07T08:41:32.0095038Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0095528Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0095982Z layer_outputs = layer_module( 2025-09-07T08:41:32.0096310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0096651Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0097044Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0097433Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0097820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0098210Z self_outputs = self.self( 2025-09-07T08:41:32.0098584Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0098999Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0099459Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0100007Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0100242Z 2025-09-07T08:41:32.0100319Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0100528Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0100726Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0100913Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0101181Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0101397Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0101621Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0102098Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0102556Z layer_outputs = layer_module( 2025-09-07T08:41:32.0102883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0103231Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0103629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0104019Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0104417Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0104807Z self_outputs = self.self( 2025-09-07T08:41:32.0105186Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0105580Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0105695Z 2025-09-07T08:41:32.0105796Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0106284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0106744Z layer_outputs = layer_module( 2025-09-07T08:41:32.0107074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0107419Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0107808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0108204Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0108598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0108990Z self_outputs = self.self( 2025-09-07T08:41:32.0109361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0109759Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0109897Z 2025-09-07T08:41:32.0109995Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0110486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0110944Z layer_outputs = layer_module( 2025-09-07T08:41:32.0111266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0111613Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0112015Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0112409Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0112798Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0113175Z self_outputs = self.self( 2025-09-07T08:41:32.0113550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0113952Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0114084Z 2025-09-07T08:41:32.0114189Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0114708Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0115193Z layer_outputs = layer_module( 2025-09-07T08:41:32.0115526Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0115874Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0116271Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0116664Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0117054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0117442Z self_outputs = self.self( 2025-09-07T08:41:32.0117823Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0118258Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0118753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0119314Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0119721Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0120062Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0120209Z 2025-09-07T08:41:32.0120314Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0120795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0121258Z layer_outputs = layer_module( 2025-09-07T08:41:32.0121590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0121939Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0122336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0122721Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0123114Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0123505Z self_outputs = self.self( 2025-09-07T08:41:32.0123884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0124324Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0124818Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0125342Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0125827Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0126277Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0126603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0126932Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0127089Z 2025-09-07T08:41:32.0127187Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0127677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0128848Z layer_outputs = layer_module( 2025-09-07T08:41:32.0129217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0129558Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0129964Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0130364Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0130766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0131164Z self_outputs = self.self( 2025-09-07T08:41:32.0131537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0131967Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0132464Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0132999Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0133193Z 2025-09-07T08:41:32.0133298Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0133778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0134237Z layer_outputs = layer_module( 2025-09-07T08:41:32.0134568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0134911Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0135299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0135692Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0136089Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0136476Z self_outputs = self.self( 2025-09-07T08:41:32.0136855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0137276Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0137769Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0138297Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0138489Z 2025-09-07T08:41:32.0138596Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0139076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0139527Z layer_outputs = layer_module( 2025-09-07T08:41:32.0139852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0140195Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0140586Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0140976Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0141359Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0141741Z self_outputs = self.self( 2025-09-07T08:41:32.0142155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0142683Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0142908Z 2025-09-07T08:41:32.0143015Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0143486Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0143943Z layer_outputs = layer_module( 2025-09-07T08:41:32.0144269Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0144610Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0145001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0145384Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0145778Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0146211Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0146637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0147038Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0147168Z 2025-09-07T08:41:32.0147247Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0147452Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0147652Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0147848Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0148064Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0148551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0149013Z layer_outputs = layer_module( 2025-09-07T08:41:32.0149342Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0149684Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0150068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0150464Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0150856Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0151244Z self_outputs = self.self( 2025-09-07T08:41:32.0151613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0152013Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0152154Z 2025-09-07T08:41:32.0152250Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0152743Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0153198Z layer_outputs = layer_module( 2025-09-07T08:41:32.0153518Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0153858Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0154253Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0154649Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0155039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0155469Z self_outputs = self.self( 2025-09-07T08:41:32.0155869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0156290Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0156421Z 2025-09-07T08:41:32.0156528Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0157008Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0157455Z layer_outputs = layer_module( 2025-09-07T08:41:32.0157779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0158121Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0158513Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0158914Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0159297Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0159684Z self_outputs = self.self( 2025-09-07T08:41:32.0160059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0160482Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0160949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0161507Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0161745Z 2025-09-07T08:41:32.0161845Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0162334Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0163211Z layer_outputs = layer_module( 2025-09-07T08:41:32.0163545Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0163886Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0164284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0164684Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0165084Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0165475Z self_outputs = self.self( 2025-09-07T08:41:32.0165850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0166251Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0166378Z 2025-09-07T08:41:32.0166484Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0166963Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0167411Z layer_outputs = layer_module( 2025-09-07T08:41:32.0167738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0168080Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0168473Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0168867Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0169287Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0169710Z self_outputs = self.self( 2025-09-07T08:41:32.0170086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0170510Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0170982Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0171504Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0171980Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0172373Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0172498Z 2025-09-07T08:41:32.0172604Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0173097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0173549Z layer_outputs = layer_module( 2025-09-07T08:41:32.0173883Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0174230Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0174626Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0175022Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0175410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0175797Z self_outputs = self.self( 2025-09-07T08:41:32.0176180Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0176599Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0177061Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0177620Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0177854Z 2025-09-07T08:41:32.0177952Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0178440Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0178898Z layer_outputs = layer_module( 2025-09-07T08:41:32.0179235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0179585Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0179988Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0180386Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0180786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0181224Z self_outputs = self.self( 2025-09-07T08:41:32.0181607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0182032Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0182508Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0183137Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0183413Z 2025-09-07T08:41:32.0183522Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0184003Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0184465Z layer_outputs = layer_module( 2025-09-07T08:41:32.0184801Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0185158Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0185550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0185978Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0186364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0186743Z self_outputs = self.self( 2025-09-07T08:41:32.0187109Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0187513Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0187975Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0188513Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0188734Z 2025-09-07T08:41:32.0188819Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0189015Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0189227Z cudagraph partition due to non gpu ops. 
Found from :
2025-09-07T08:41:32.0189706Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0190157Z layer_outputs = layer_module(
2025-09-07T08:41:32.0190480Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0190815Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0191191Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0191570Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0191955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0192334Z self_outputs = self.self(
2025-09-07T08:41:32.0192691Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward
2025-09-07T08:41:32.0193067Z attn_scores += diagonal_mask
2025-09-07T08:41:32.0193189Z
2025-09-07T08:41:32.0193285Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0193757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0194199Z layer_outputs = layer_module(
2025-09-07T08:41:32.0194509Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0194842Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0195227Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0195609Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0195995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0196394Z self_outputs = self.self(
2025-09-07T08:41:32.0196784Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward
2025-09-07T08:41:32.0197206Z attn_probs = nn.functional.softmax(
2025-09-07T08:41:32.0197329Z
2025-09-07T08:41:32.0197431Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0197901Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0198336Z layer_outputs = layer_module(
2025-09-07T08:41:32.0198656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0198991Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0199375Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0199766Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0200142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0200519Z self_outputs = self.self(
2025-09-07T08:41:32.0200889Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward
2025-09-07T08:41:32.0201278Z value_vectors = self.value(hidden_states)
2025-09-07T08:41:32.0201405Z
2025-09-07T08:41:32.0201501Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0201971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0202413Z layer_outputs = layer_module(
2025-09-07T08:41:32.0202734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0203069Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0203447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0203831Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0204218Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0204595Z self_outputs = self.self(
2025-09-07T08:41:32.0204958Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T08:41:32.0205368Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T08:41:32.0205849Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T08:41:32.0206387Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
2025-09-07T08:41:32.0206781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-09-07T08:41:32.0207106Z return torch._C._nn.pad(input, pad, mode, value)
2025-09-07T08:41:32.0207247Z
2025-09-07T08:41:32.0207341Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0207812Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0208260Z layer_outputs = layer_module(
2025-09-07T08:41:32.0208585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0208918Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0209330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0209749Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0210134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0210511Z self_outputs = self.self(
2025-09-07T08:41:32.0210875Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T08:41:32.0211297Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T08:41:32.0211783Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T08:41:32.0212287Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs)
2025-09-07T08:41:32.0212756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize
2025-09-07T08:41:32.0213193Z chunked_hidden_states = nn.functional.pad(
2025-09-07T08:41:32.0213502Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad
2025-09-07T08:41:32.0213826Z return torch._C._nn.pad(input, pad, mode, value)
2025-09-07T08:41:32.0213974Z
2025-09-07T08:41:32.0214071Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0214538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0214984Z layer_outputs = layer_module(
2025-09-07T08:41:32.0215300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0215641Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0216036Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0216422Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0216802Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0217184Z self_outputs = self.self(
2025-09-07T08:41:32.0217557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T08:41:32.0217982Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T08:41:32.0218466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T08:41:32.0218979Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-09-07T08:41:32.0219179Z
2025-09-07T08:41:32.0219277Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0219753Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0220197Z layer_outputs = layer_module(
2025-09-07T08:41:32.0220517Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0220848Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0221235Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0221618Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0222002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0222379Z self_outputs = self.self(
2025-09-07T08:41:32.0222766Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward
2025-09-07T08:41:32.0223226Z attn_output = self._sliding_chunks_matmul_attn_probs_value(
2025-09-07T08:41:32.0223714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value
2025-09-07T08:41:32.0224241Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
2025-09-07T08:41:32.0224432Z
2025-09-07T08:41:32.0224533Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0225009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0225465Z layer_outputs = layer_module(
2025-09-07T08:41:32.0225797Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0226141Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0226540Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0226928Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0227321Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0227704Z self_outputs = self.self(
2025-09-07T08:41:32.0228078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward
2025-09-07T08:41:32.0228571Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous()
2025-09-07T08:41:32.0228792Z
2025-09-07T08:41:32.0228886Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0229370Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0229829Z layer_outputs = layer_module(
2025-09-07T08:41:32.0230156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0230499Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0230882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0231278Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0231669Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward
2025-09-07T08:41:32.0232097Z attn_output = self.output(self_outputs[0], hidden_states)
2025-09-07T08:41:32.0232515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward
2025-09-07T08:41:32.0232919Z hidden_states = self.dense(hidden_states)
2025-09-07T08:41:32.0233059Z
2025-09-07T08:41:32.0233134Z cudagraph partition due to non gpu ops
2025-09-07T08:41:32.0233337Z cudagraph partition due to non gpu ops
2025-09-07T08:41:32.0233535Z cudagraph partition due to non gpu ops
2025-09-07T08:41:32.0233722Z cudagraph partition due to non gpu ops
2025-09-07T08:41:32.0233943Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0234425Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0234884Z layer_outputs = layer_module(
2025-09-07T08:41:32.0235205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0235552Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0235979Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0236399Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0236789Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0237169Z self_outputs = self.self(
2025-09-07T08:41:32.0237544Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
2025-09-07T08:41:32.0237944Z query_vectors = self.query(hidden_states)
2025-09-07T08:41:32.0238076Z
2025-09-07T08:41:32.0238182Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0238656Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0239099Z layer_outputs = layer_module(
2025-09-07T08:41:32.0239426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0239768Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0240160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0240548Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0240934Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0241317Z self_outputs = self.self(
2025-09-07T08:41:32.0241692Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward
2025-09-07T08:41:32.0242092Z query_vectors = self.query(hidden_states)
2025-09-07T08:41:32.0242221Z
2025-09-07T08:41:32.0242322Z cudagraph partition due to non gpu ops.
Found from :
2025-09-07T08:41:32.0242811Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0243278Z layer_outputs = layer_module(
2025-09-07T08:41:32.0243616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0243963Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0244361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0244761Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0245161Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0245558Z self_outputs = self.self(
2025-09-07T08:41:32.0245941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward
2025-09-07T08:41:32.0246367Z attn_scores = self._sliding_chunks_query_key_matmul(
2025-09-07T08:41:32.0246852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul
2025-09-07T08:41:32.0247414Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply
2025-09-07T08:41:32.0247649Z
2025-09-07T08:41:32.0247760Z cudagraph partition due to non gpu ops. Found from :
2025-09-07T08:41:32.0248257Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244
2025-09-07T08:41:32.0248718Z layer_outputs = layer_module(
2025-09-07T08:41:32.0249060Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__
2025-09-07T08:41:32.0249521Z return super().__call__(*args, **kwargs)
2025-09-07T08:41:32.0249939Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward
2025-09-07T08:41:32.0250338Z self_attn_outputs = self.attention(
2025-09-07T08:41:32.0250728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward
2025-09-07T08:41:32.0251119Z self_outputs = self.self(
2025-09-07T08:41:32.0251503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward
2025-09-07T08:41:32.0251903Z key_vectors = self.key(hidden_states)
2025-09-07T08:41:32.0252033Z
2025-09-07T08:41:32.0252138Z cudagraph partition due to non gpu ops.
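Every block above is inductor reporting the same thing: while preparing CUDA-graph capture it found an operation it does not treat as GPU-side work, so it splits ("partitions") the graph at that point and only captures the purely-GPU regions. A minimal, purely hypothetical way to provoke the same class of message locally is a compiled module with a deliberate CPU round-trip in the middle; this sketch assumes a CUDA device and a recent PyTorch where torch.compile(mode="reduce-overhead") enables CUDA graphs, and the exact log wording varies by inductor version.

    import torch
    import torch.nn as nn

    class ToyBlock(nn.Module):
        """Hypothetical module: the .cpu() round-trip is a deliberately
        non-GPU op in the middle of the forward, the kind of thing inductor
        partitions its CUDA graph around."""

        def __init__(self):
            super().__init__()
            self.proj = nn.Linear(64, 64)

        def forward(self, x):
            y = self.proj(x)
            y = y.cpu().to(x.device)  # non-GPU op inside the compiled region
            return torch.softmax(y, dim=-1)

    if torch.cuda.is_available():
        model = ToyBlock().cuda()
        # mode="reduce-overhead" is what turns CUDA graphs on in torch.compile
        compiled = torch.compile(model, mode="reduce-overhead")
        out = compiled(torch.randn(8, 64, device="cuda"))
        print(out.shape)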
The same set of cudagraph-partition warnings, with identical tracebacks into modeling_longformer.py, repeats several more times in the log.
Found from : 2025-09-07T08:41:32.0405604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0406061Z layer_outputs = layer_module( 2025-09-07T08:41:32.0406382Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0406715Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0407102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0407482Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0407866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0408244Z self_outputs = self.self( 2025-09-07T08:41:32.0408611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0408996Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0409132Z 2025-09-07T08:41:32.0409260Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0409775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0410217Z layer_outputs = layer_module( 2025-09-07T08:41:32.0410538Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0410866Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0411252Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0411637Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0412021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0412395Z self_outputs = self.self( 2025-09-07T08:41:32.0412756Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0413149Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0413283Z 2025-09-07T08:41:32.0413377Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0413842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0414286Z layer_outputs = layer_module( 2025-09-07T08:41:32.0414596Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0414930Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0415314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0415699Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0416087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0416456Z self_outputs = self.self( 2025-09-07T08:41:32.0416829Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0417236Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0417693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0418234Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0418457Z 2025-09-07T08:41:32.0418551Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0419026Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0419474Z layer_outputs = layer_module( 2025-09-07T08:41:32.0419795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0420127Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0420503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0420888Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0421270Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0421649Z self_outputs = self.self( 2025-09-07T08:41:32.0422005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0422426Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0422578Z 2025-09-07T08:41:32.0422695Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0423165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0423612Z layer_outputs = layer_module( 2025-09-07T08:41:32.0423925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0424268Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0424658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0425043Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0425428Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0425803Z self_outputs = self.self( 2025-09-07T08:41:32.0426174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0426582Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0427045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0427554Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0428009Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0428392Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0428525Z 2025-09-07T08:41:32.0428622Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0429097Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0429547Z layer_outputs = layer_module( 2025-09-07T08:41:32.0429864Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0430202Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0430590Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0430972Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0431345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0431725Z self_outputs = self.self( 2025-09-07T08:41:32.0432093Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0432502Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0432960Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0433492Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0433725Z 2025-09-07T08:41:32.0433821Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0434295Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0434739Z layer_outputs = layer_module( 2025-09-07T08:41:32.0435062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0435389Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0435830Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0436238Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0436625Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0437006Z self_outputs = self.self( 2025-09-07T08:41:32.0437367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0437776Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0438236Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0438778Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0439004Z 2025-09-07T08:41:32.0439109Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0439579Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0440034Z layer_outputs = layer_module( 2025-09-07T08:41:32.0440361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0440698Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0441086Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0441470Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0441861Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0442247Z self_outputs = self.self( 2025-09-07T08:41:32.0442619Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0443031Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0443484Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0444024Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0444252Z 2025-09-07T08:41:32.0444326Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0444532Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0444756Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0445230Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0445681Z layer_outputs = layer_module( 2025-09-07T08:41:32.0446007Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0446345Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0446727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0447120Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0447507Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0447890Z self_outputs = self.self( 2025-09-07T08:41:32.0448263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0448639Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0448760Z 2025-09-07T08:41:32.0448920Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0449410Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0449853Z layer_outputs = layer_module( 2025-09-07T08:41:32.0450174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0450505Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0450892Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0451278Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0451665Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0452044Z self_outputs = self.self( 2025-09-07T08:41:32.0452411Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0452804Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0452936Z 2025-09-07T08:41:32.0453032Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0593105Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0593560Z layer_outputs = layer_module( 2025-09-07T08:41:32.0593887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0594229Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0594651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0595092Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0595478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0595856Z self_outputs = self.self( 2025-09-07T08:41:32.0596216Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0596628Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0597087Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0597596Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0598069Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0598457Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0598592Z 2025-09-07T08:41:32.0598692Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0599179Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0599637Z layer_outputs = layer_module( 2025-09-07T08:41:32.0599969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0600304Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0600702Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0601095Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0601491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0601885Z self_outputs = self.self( 2025-09-07T08:41:32.0602259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0602679Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0603156Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0603711Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0603937Z 2025-09-07T08:41:32.0604044Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0604541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0604993Z layer_outputs = layer_module( 2025-09-07T08:41:32.0605315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0605648Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0606033Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0606407Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0606791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0607171Z self_outputs = self.self( 2025-09-07T08:41:32.0607534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0607942Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0608453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0609002Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0609231Z 2025-09-07T08:41:32.0609326Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0609799Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0610246Z layer_outputs = layer_module( 2025-09-07T08:41:32.0610562Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0610895Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0611278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0611663Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0612040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0612414Z self_outputs = self.self( 2025-09-07T08:41:32.0612779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0613185Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0613642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0614169Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0614396Z 2025-09-07T08:41:32.0614472Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0614679Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0614902Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0615372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0615805Z layer_outputs = layer_module( 2025-09-07T08:41:32.0616127Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0616461Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0616846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0617229Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0617607Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0617990Z self_outputs = self.self( 2025-09-07T08:41:32.0618368Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0618747Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0618856Z 2025-09-07T08:41:32.0618960Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0619423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0619870Z layer_outputs = layer_module( 2025-09-07T08:41:32.0620189Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0620521Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0620896Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0621326Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0621734Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0622112Z self_outputs = self.self( 2025-09-07T08:41:32.0622479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0622861Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0622992Z 2025-09-07T08:41:32.0623088Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0623558Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0624008Z layer_outputs = layer_module( 2025-09-07T08:41:32.0624329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0624656Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0625045Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0625426Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0625809Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0626183Z self_outputs = self.self( 2025-09-07T08:41:32.0626543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0626933Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0627069Z 2025-09-07T08:41:32.0627164Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0627637Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0628080Z layer_outputs = layer_module( 2025-09-07T08:41:32.0628391Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0628722Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0629107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0629494Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0629870Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0630248Z self_outputs = self.self( 2025-09-07T08:41:32.0630613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0631040Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0631535Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0632069Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0632466Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0632813Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0632954Z 2025-09-07T08:41:32.0633061Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0633547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0633998Z layer_outputs = layer_module( 2025-09-07T08:41:32.0634364Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0634739Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0635142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0635539Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0635931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0636325Z self_outputs = self.self( 2025-09-07T08:41:32.0636695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0637120Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0637604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0638101Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0638572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0639001Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0639316Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0639645Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0639783Z 2025-09-07T08:41:32.0639880Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0640351Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0640798Z layer_outputs = layer_module( 2025-09-07T08:41:32.0641120Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0641462Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0641842Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0642227Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0642615Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0642994Z self_outputs = self.self( 2025-09-07T08:41:32.0643250Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0643366Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0643687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0643831Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0643834Z 2025-09-07T08:41:32.0643938Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0644260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0644334Z layer_outputs = layer_module( 2025-09-07T08:41:32.0644539Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0644612Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0644874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0644942Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0645262Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0645342Z self_outputs = self.self( 2025-09-07T08:41:32.0645605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0645714Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0646038Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0646185Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0646188Z 2025-09-07T08:41:32.0646283Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0646616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0646684Z layer_outputs = layer_module( 2025-09-07T08:41:32.0646894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0646967Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0647225Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0647300Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0647556Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0647628Z self_outputs = self.self( 2025-09-07T08:41:32.0647885Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0648062Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0648075Z 2025-09-07T08:41:32.0648173Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0648499Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0648572Z layer_outputs = layer_module( 2025-09-07T08:41:32.0648775Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0648854Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0649111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0649179Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0649447Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0649554Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0649819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0649898Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0649901Z 2025-09-07T08:41:32.0649983Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0650055Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0650126Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0650204Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0650297Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0650631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0650696Z layer_outputs = layer_module( 2025-09-07T08:41:32.0650941Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0651043Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0651299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0651376Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0651628Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0651691Z self_outputs = self.self( 2025-09-07T08:41:32.0651955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0652034Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0652037Z 2025-09-07T08:41:32.0652142Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0652469Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0652544Z layer_outputs = layer_module( 2025-09-07T08:41:32.0652745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0652817Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0653082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0653149Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0653414Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0653477Z self_outputs = self.self( 2025-09-07T08:41:32.0653739Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0653825Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0653828Z 2025-09-07T08:41:32.0653921Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0654247Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0654313Z layer_outputs = layer_module( 2025-09-07T08:41:32.0654520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0654591Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0654846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0654924Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0655182Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0655257Z self_outputs = self.self( 2025-09-07T08:41:32.0655515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0655613Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0655929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0656102Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0656105Z 2025-09-07T08:41:32.0656209Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0656560Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0656651Z layer_outputs = layer_module( 2025-09-07T08:41:32.0656869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0656939Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0657205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0657274Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0657534Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0657598Z self_outputs = self.self( 2025-09-07T08:41:32.0657859Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0657935Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0657941Z 2025-09-07T08:41:32.0658037Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0658372Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0658436Z layer_outputs = layer_module( 2025-09-07T08:41:32.0658647Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0658719Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0658972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0659046Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0659299Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0659374Z self_outputs = self.self( 2025-09-07T08:41:32.0659630Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0659735Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0660042Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0660188Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0660453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0660522Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0660526Z 2025-09-07T08:41:32.0660627Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0660948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0661023Z layer_outputs = layer_module( 2025-09-07T08:41:32.0661222Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0661292Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0661557Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0661623Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0661884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0661948Z self_outputs = self.self( 2025-09-07T08:41:32.0662200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0662349Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0662693Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0662872Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0662875Z 2025-09-07T08:41:32.0662971Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0663296Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0663361Z layer_outputs = layer_module( 2025-09-07T08:41:32.0663563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0663643Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0663901Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0663977Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0664234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0664305Z self_outputs = self.self( 2025-09-07T08:41:32.0664559Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0664652Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0664971Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0665140Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0665143Z 2025-09-07T08:41:32.0665247Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0665568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0665644Z layer_outputs = layer_module( 2025-09-07T08:41:32.0665847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0665921Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0666187Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0666255Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0666515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0666578Z self_outputs = self.self( 2025-09-07T08:41:32.0666832Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0666935Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0667242Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0667417Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0667420Z 2025-09-07T08:41:32.0667494Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0667577Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0667671Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0667992Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0668113Z layer_outputs = layer_module( 2025-09-07T08:41:32.0668330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0668426Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0668687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0668753Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0669017Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0669083Z self_outputs = self.self( 2025-09-07T08:41:32.0669345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0669413Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0669416Z 2025-09-07T08:41:32.0669518Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0669846Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0669912Z layer_outputs = layer_module( 2025-09-07T08:41:32.0670122Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0670194Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0670461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0670529Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0670790Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0670864Z self_outputs = self.self( 2025-09-07T08:41:32.0671126Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0671208Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0671211Z 2025-09-07T08:41:32.0671307Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0671638Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0671702Z layer_outputs = layer_module( 2025-09-07T08:41:32.0671904Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0671983Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0672241Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0672316Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0672573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0672638Z self_outputs = self.self( 2025-09-07T08:41:32.0672905Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0672983Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0672986Z 2025-09-07T08:41:32.0673086Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0673408Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0673479Z layer_outputs = layer_module( 2025-09-07T08:41:32.0673680Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0673751Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0674058Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0674140Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0674399Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0674463Z self_outputs = self.self( 2025-09-07T08:41:32.0674715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0674832Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0675154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0675322Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0675503Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0675603Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0675607Z 2025-09-07T08:41:32.0675699Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0676018Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0676089Z layer_outputs = layer_module( 2025-09-07T08:41:32.0676293Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0676371Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0676627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0676705Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0676966Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0677030Z self_outputs = self.self( 2025-09-07T08:41:32.0677289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0677397Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0677729Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0677853Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0678145Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0678242Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0678422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0678521Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0678524Z 2025-09-07T08:41:32.0678619Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0678949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0679014Z layer_outputs = layer_module( 2025-09-07T08:41:32.0679214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0679295Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0679550Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0679676Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0679948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0680021Z self_outputs = self.self( 2025-09-07T08:41:32.0680279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0680387Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0680725Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0680865Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0680868Z 2025-09-07T08:41:32.0681005Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0681330Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0681408Z layer_outputs = layer_module( 2025-09-07T08:41:32.0681611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0681685Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0681950Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0682019Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0682286Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0682348Z self_outputs = self.self( 2025-09-07T08:41:32.0682605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0682731Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0683056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0683204Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0683207Z 2025-09-07T08:41:32.0683304Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0683635Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0683704Z layer_outputs = layer_module( 2025-09-07T08:41:32.0683908Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0683996Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0684259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0684340Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0684599Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0684663Z self_outputs = self.self( 2025-09-07T08:41:32.0684924Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0685094Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0685097Z 2025-09-07T08:41:32.0685198Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0685567Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0685667Z layer_outputs = layer_module( 2025-09-07T08:41:32.0685894Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0685966Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0686234Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0686303Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0686568Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0686672Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0686942Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0687024Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0687029Z 2025-09-07T08:41:32.0687107Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0687188Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0687259Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0687338Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0687433Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0687755Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0687832Z layer_outputs = layer_module( 2025-09-07T08:41:32.0688039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0688115Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0688376Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0688447Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0688712Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0688776Z self_outputs = self.self( 2025-09-07T08:41:32.0689040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0689118Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0689121Z 2025-09-07T08:41:32.0689221Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0689547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0689612Z layer_outputs = layer_module( 2025-09-07T08:41:32.0689826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0689900Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0690165Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0690231Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0690483Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0690552Z self_outputs = self.self( 2025-09-07T08:41:32.0690807Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0690890Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0690892Z 2025-09-07T08:41:32.0690983Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0691345Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0691436Z layer_outputs = layer_module( 2025-09-07T08:41:32.0691648Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0691727Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0691989Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0692063Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0692317Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0692385Z self_outputs = self.self( 2025-09-07T08:41:32.0692646Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0692744Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0693064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0693235Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0693239Z 2025-09-07T08:41:32.0693342Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0693661Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0693725Z layer_outputs = layer_module( 2025-09-07T08:41:32.0693936Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0694006Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0694272Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0694342Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0694605Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0694668Z self_outputs = self.self( 2025-09-07T08:41:32.0694920Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0694999Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0695002Z 2025-09-07T08:41:32.0695094Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0695422Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0695489Z layer_outputs = layer_module( 2025-09-07T08:41:32.0695700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0695773Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0696030Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0696105Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0696361Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0696433Z self_outputs = self.self( 2025-09-07T08:41:32.0696687Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0696783Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0697131Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0697301Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0697565Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0697637Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0697640Z 2025-09-07T08:41:32.0697741Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0698062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0698127Z layer_outputs = layer_module( 2025-09-07T08:41:32.0698337Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0698410Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0698677Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0698747Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0699001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0699073Z self_outputs = self.self( 2025-09-07T08:41:32.0699328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0699427Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0699738Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0699917Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0699921Z 2025-09-07T08:41:32.0700018Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0700340Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0700413Z layer_outputs = layer_module( 2025-09-07T08:41:32.0700613Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0700691Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0700947Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0701022Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0701277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0701346Z self_outputs = self.self( 2025-09-07T08:41:32.0701612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0701705Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0702019Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0702189Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0702191Z 2025-09-07T08:41:32.0702291Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0702612Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0702677Z layer_outputs = layer_module( 2025-09-07T08:41:32.0702912Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0703012Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0703279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0703348Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0703603Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0703675Z self_outputs = self.self( 2025-09-07T08:41:32.0703929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0704030Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0704341Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0704519Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0704522Z 2025-09-07T08:41:32.0704595Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0704667Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0704767Z cudagraph partition due to non gpu ops. 
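For context on the op most often named in the partition messages above: the tracebacks point at the chunked query-key matmul in Longformer's sliding-window attention, torch.einsum("bcxd,bcyd->bcxy", (query, key)). The following is a minimal, hedged sketch of what that contraction computes; the shapes are illustrative and not taken from this run.

import torch

# Illustrative shapes only: b = batch*heads, c = number of chunks,
# x = y = tokens per chunk, d = head dimension.
b, c, x, d = 2, 4, 16, 8
query = torch.randn(b, c, x, d)
key = torch.randn(b, c, x, d)

# Same einsum equation as in the traceback: within each chunk, multiply the
# chunk's queries against the chunk's keys to get per-chunk attention scores.
scores = torch.einsum("bcxd,bcyd->bcxy", (query, key))
print(scores.shape)  # torch.Size([2, 4, 16, 16])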
Found from : 2025-09-07T08:41:32.0705091Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0705164Z layer_outputs = layer_module( 2025-09-07T08:41:32.0705367Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0705439Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0705704Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0705775Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0706040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0706103Z self_outputs = self.self( 2025-09-07T08:41:32.0706369Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0706434Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0706437Z 2025-09-07T08:41:32.0706529Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0706858Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0706921Z layer_outputs = layer_module( 2025-09-07T08:41:32.0707134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0707208Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0707463Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0707537Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0707791Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0707861Z self_outputs = self.self( 2025-09-07T08:41:32.0708116Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0708194Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0708196Z 2025-09-07T08:41:32.0708290Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0708639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0708747Z layer_outputs = layer_module( 2025-09-07T08:41:32.0708948Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0709028Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0709282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0709350Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0709616Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0709679Z self_outputs = self.self( 2025-09-07T08:41:32.0709945Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0710025Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0710030Z 2025-09-07T08:41:32.0710131Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0710453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0710517Z layer_outputs = layer_module( 2025-09-07T08:41:32.0710726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0710797Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0711059Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0711127Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0711387Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0711461Z self_outputs = self.self( 2025-09-07T08:41:32.0711715Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0711831Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0712154Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0712320Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0712498Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0712590Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0712603Z 2025-09-07T08:41:32.0712696Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0713023Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0713099Z layer_outputs = layer_module( 2025-09-07T08:41:32.0713300Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0713381Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0713636Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0713704Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0713969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0714032Z self_outputs = self.self( 2025-09-07T08:41:32.0714322Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0714459Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0714782Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0714917Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0715209Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0715305Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0715482Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0715580Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0715583Z 2025-09-07T08:41:32.0715680Z cudagraph partition due to non gpu ops. 
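The two tracebacks above end in torch.nn.functional.pad, which is reported as the partition point. As a hedged illustration (window_overlap and the tensor shape are made up, not values from this job), the pad call from _sliding_chunks_matmul_attn_probs_value adds window_overlap rows of padding before and after the sequence dimension, filling with -1:

import torch
import torch.nn.functional as F

window_overlap = 2                      # illustrative value
value = torch.randn(3, 10, 8)           # (batch*heads, seq_len, head_dim), made-up shape

# Same pad spec as in the traceback: pad the last dim by (0, 0) and the
# second-to-last dim by window_overlap on each side, using constant value -1.
padded_value = F.pad(value, (0, 0, window_overlap, window_overlap), value=-1)
print(padded_value.shape)  # torch.Size([3, 14, 8])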
Found from : 2025-09-07T08:41:32.0716005Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0716080Z layer_outputs = layer_module( 2025-09-07T08:41:32.0716284Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0716361Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0716620Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0716698Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0716959Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0717022Z self_outputs = self.self( 2025-09-07T08:41:32.0717289Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0717397Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0717732Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0717874Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0717877Z 2025-09-07T08:41:32.0717978Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0718303Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0718369Z layer_outputs = layer_module( 2025-09-07T08:41:32.0718580Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0718656Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0718923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0718991Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0719249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0719320Z self_outputs = self.self( 2025-09-07T08:41:32.0719577Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0719692Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0720021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0720195Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0720226Z 2025-09-07T08:41:32.0720323Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0720645Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0720719Z layer_outputs = layer_module( 2025-09-07T08:41:32.0720923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0721001Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0721260Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0721327Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0721591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0721657Z self_outputs = self.self( 2025-09-07T08:41:32.0721921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0722093Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0722096Z 2025-09-07T08:41:32.0722197Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0722520Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0722586Z layer_outputs = layer_module( 2025-09-07T08:41:32.0722795Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0722868Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0723134Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0723205Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0723468Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0723569Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0723826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0723912Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0723915Z 2025-09-07T08:41:32.0723988Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0724069Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0724140Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0724208Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0724316Z cudagraph partition due to non gpu ops. 
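These "cudagraph partition due to non gpu ops" lines are emitted while inductor decides where it can capture CUDA graphs; an op that does not run as a GPU kernel forces the captured graph to be split at that point and the surrounding region to run outside the graph. As a hedged, generic sketch (not the benchmark harness used by this job), the cudagraph path is typically exercised by compiling with the reduce-overhead mode:

import torch

# Hedged sketch: mode="reduce-overhead" asks inductor to use CUDA graphs where
# it can; regions containing non-GPU ops are partitioned out and run eagerly.
def f(x):
    return torch.nn.functional.softmax(x @ x.transpose(-1, -2), dim=-1)

compiled = torch.compile(f, mode="reduce-overhead")
x = torch.randn(8, 16, device="cuda" if torch.cuda.is_available() else "cpu")
print(compiled(x).shape)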
Found from : 2025-09-07T08:41:32.0724639Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0724711Z layer_outputs = layer_module( 2025-09-07T08:41:32.0724915Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0724988Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0725254Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0725321Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0725588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0725653Z self_outputs = self.self( 2025-09-07T08:41:32.0725951Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0726059Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0726063Z 2025-09-07T08:41:32.0726158Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0726491Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0726558Z layer_outputs = layer_module( 2025-09-07T08:41:32.0726771Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0726844Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0727106Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0727185Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0727446Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0727520Z self_outputs = self.self( 2025-09-07T08:41:32.0727779Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0727865Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0727868Z 2025-09-07T08:41:32.0727964Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0728285Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0728360Z layer_outputs = layer_module( 2025-09-07T08:41:32.0728563Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0728648Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0728907Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0728985Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0729245Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0729308Z self_outputs = self.self( 2025-09-07T08:41:32.0729573Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0729670Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0729990Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0730163Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0730168Z 2025-09-07T08:41:32.0730265Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0730598Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0730663Z layer_outputs = layer_module( 2025-09-07T08:41:32.0730874Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0730945Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0731210Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0731279Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0731536Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0731663Z self_outputs = self.self( 2025-09-07T08:41:32.0731931Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0732013Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0732016Z 2025-09-07T08:41:32.0732109Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0732432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0732496Z layer_outputs = layer_module( 2025-09-07T08:41:32.0732695Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0732775Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0733032Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0733108Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0733363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0733426Z self_outputs = self.self( 2025-09-07T08:41:32.0733689Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0733782Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0734102Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0734244Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0734510Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0734582Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0734586Z 2025-09-07T08:41:32.0734679Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0735010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0735073Z layer_outputs = layer_module( 2025-09-07T08:41:32.0735282Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0735354Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0735606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0735682Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0735938Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0736012Z self_outputs = self.self( 2025-09-07T08:41:32.0736265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0736364Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0736672Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0736841Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0736844Z 2025-09-07T08:41:32.0736945Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0737265Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0737369Z layer_outputs = layer_module( 2025-09-07T08:41:32.0737592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0737689Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0737946Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0738014Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0738278Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0738341Z self_outputs = self.self( 2025-09-07T08:41:32.0738600Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0738691Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0739002Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0739178Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0739181Z 2025-09-07T08:41:32.0739275Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0739604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0739669Z layer_outputs = layer_module( 2025-09-07T08:41:32.0739882Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0739956Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0740215Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0740296Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0740553Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0740624Z self_outputs = self.self( 2025-09-07T08:41:32.0740879Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0740976Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0741288Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0741453Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0741456Z 2025-09-07T08:41:32.0741537Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0741608Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0741714Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0742037Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0742103Z layer_outputs = layer_module( 2025-09-07T08:41:32.0742315Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0742386Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0742650Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0742718Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0742983Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0743048Z self_outputs = self.self( 2025-09-07T08:41:32.0743355Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0743445Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0743448Z 2025-09-07T08:41:32.0743543Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0743873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0743939Z layer_outputs = layer_module( 2025-09-07T08:41:32.0744142Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0744221Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0744479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0744559Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0744819Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0744893Z self_outputs = self.self( 2025-09-07T08:41:32.0745151Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0745224Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0745227Z 2025-09-07T08:41:32.0745329Z cudagraph partition due to non gpu ops. 
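Two of the ops flagged above are the in-place mask addition (attn_scores += diagonal_mask) and the softmax that follows it. A minimal hedged sketch of that masked-softmax pattern, with made-up shapes and an arbitrary masked position:

import torch
import torch.nn as nn

attn_scores = torch.randn(2, 4, 6, 6)     # illustrative shape
diagonal_mask = torch.zeros(2, 4, 6, 6)
diagonal_mask[..., 0] = float("-inf")     # mask out an arbitrary position

attn_scores += diagonal_mask              # same in-place add as in the traceback
attn_probs = nn.functional.softmax(attn_scores, dim=-1)
print(attn_probs.sum(dim=-1))             # rows sum to 1; masked positions get probability 0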
Found from : 2025-09-07T08:41:32.0745651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0745722Z layer_outputs = layer_module( 2025-09-07T08:41:32.0745923Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0745997Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0746263Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0746331Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0746596Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0746659Z self_outputs = self.self( 2025-09-07T08:41:32.0746921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0746999Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0747003Z 2025-09-07T08:41:32.0747095Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0747423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0747490Z layer_outputs = layer_module( 2025-09-07T08:41:32.0747700Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0747773Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0748040Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0748108Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0748363Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0748434Z self_outputs = self.self( 2025-09-07T08:41:32.0748690Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0748840Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0749183Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0749360Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0749547Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0749637Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0749640Z 2025-09-07T08:41:32.0749742Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0750062Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0750134Z layer_outputs = layer_module( 2025-09-07T08:41:32.0750338Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0750411Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0750675Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0750743Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0751006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0751070Z self_outputs = self.self( 2025-09-07T08:41:32.0751327Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0751438Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0751761Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0751899Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0752193Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0752285Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0752461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0752550Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0752553Z 2025-09-07T08:41:32.0752652Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0752972Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0753046Z layer_outputs = layer_module( 2025-09-07T08:41:32.0753249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0753332Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0753588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0753657Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0753922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0753986Z self_outputs = self.self( 2025-09-07T08:41:32.0754249Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0754353Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0754713Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0754874Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0754892Z 2025-09-07T08:41:32.0754988Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0755318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0755383Z layer_outputs = layer_module( 2025-09-07T08:41:32.0755592Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0755663Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0755922Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0755998Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0756259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0756333Z self_outputs = self.self( 2025-09-07T08:41:32.0756585Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0756699Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0757021Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0757159Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0757162Z 2025-09-07T08:41:32.0757265Z cudagraph partition due to non gpu ops. 
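The other contraction flagged in this pass is the chunked attention-probabilities-times-values einsum, torch.einsum("bcwd,bcdh->bcwh", ...). A hedged sketch with illustrative shapes, not values from this run:

import torch

b, c, w, d, h = 2, 3, 8, 5, 4             # illustrative chunk/window/head sizes
chunked_attn_probs = torch.randn(b, c, w, d)
chunked_value = torch.randn(b, c, d, h)

# Same equation as in the traceback: per-chunk weighted sum of value vectors.
context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value))
print(context.shape)  # torch.Size([2, 3, 8, 4])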
Found from : 2025-09-07T08:41:32.0757588Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0757662Z layer_outputs = layer_module( 2025-09-07T08:41:32.0757867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0757948Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0758206Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0758277Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0758541Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0758605Z self_outputs = self.self( 2025-09-07T08:41:32.0758867Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0759040Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0759044Z 2025-09-07T08:41:32.0759140Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0759471Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0759535Z layer_outputs = layer_module( 2025-09-07T08:41:32.0759744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0759817Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0760076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0760144Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0760398Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0760554Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0760821Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0760906Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0760909Z 2025-09-07T08:41:32.0760981Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0761053Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0761129Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0761199Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0761300Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0761629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0761693Z layer_outputs = layer_module( 2025-09-07T08:41:32.0761911Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0761987Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0762259Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0762329Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0762597Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0762661Z self_outputs = self.self( 2025-09-07T08:41:32.0762921Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0763007Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0763010Z 2025-09-07T08:41:32.0763102Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0763436Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0763503Z layer_outputs = layer_module( 2025-09-07T08:41:32.0763719Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0763791Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0764055Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0764133Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0764395Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0764464Z self_outputs = self.self( 2025-09-07T08:41:32.0764726Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0764805Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0764809Z 2025-09-07T08:41:32.0764912Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0765238Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0765310Z layer_outputs = layer_module( 2025-09-07T08:41:32.0765515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0765592Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0765852Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0765918Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0766237Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0766333Z self_outputs = self.self( 2025-09-07T08:41:32.0766594Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0766688Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0767001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0767179Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0767182Z 2025-09-07T08:41:32.0767277Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0767606Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0767673Z layer_outputs = layer_module( 2025-09-07T08:41:32.0767884Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0767957Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0768214Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0768292Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0768548Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0768619Z self_outputs = self.self( 2025-09-07T08:41:32.0768873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0768946Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0768956Z 2025-09-07T08:41:32.0769051Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0769377Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0769450Z layer_outputs = layer_module( 2025-09-07T08:41:32.0769654Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0769736Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0769995Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0770064Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0770329Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0770392Z self_outputs = self.self( 2025-09-07T08:41:32.0770658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0770756Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0771076Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0771222Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0771478Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0771560Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0771563Z 2025-09-07T08:41:32.0771657Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0772010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0772105Z layer_outputs = layer_module( 2025-09-07T08:41:32.0772320Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0772393Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0772655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0772736Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0773000Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0773072Z self_outputs = self.self( 2025-09-07T08:41:32.0773328Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0773425Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0773748Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0773917Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0773919Z 2025-09-07T08:41:32.0774023Z cudagraph partition due to non gpu ops. 
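The _chunk call flagged above reshapes hidden_states into overlapping windows before the query-key matmul. The transformers implementation does this with a view over strided storage; as a hedged stand-in (not the library's code), Tensor.unfold produces the same kind of overlapping chunks:

import torch

window_overlap = 2
hidden_states = torch.randn(1, 8, 4)   # (batch*heads, seq_len, head_dim), made-up shape

# Hypothetical equivalent of the overlapping chunking: windows of 2*window_overlap
# tokens that advance by window_overlap, so consecutive chunks share half their tokens.
chunks = hidden_states.unfold(dimension=1, size=2 * window_overlap, step=window_overlap)
print(chunks.shape)  # torch.Size([1, 3, 4, 4]) -> (batch, num_chunks, head_dim, chunk_len)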
Found from : 2025-09-07T08:41:32.0774346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0774421Z layer_outputs = layer_module( 2025-09-07T08:41:32.0774624Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0774694Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0774967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0775039Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0775307Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0775371Z self_outputs = self.self( 2025-09-07T08:41:32.0775629Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0775733Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0776043Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0776219Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0776222Z 2025-09-07T08:41:32.0776319Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0776655Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0776721Z layer_outputs = layer_module( 2025-09-07T08:41:32.0776925Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0777008Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0777267Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0777344Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0777604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0777673Z self_outputs = self.self( 2025-09-07T08:41:32.0777967Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0778088Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0778402Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0778569Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0778572Z 2025-09-07T08:41:32.0778654Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0778728Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0778820Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0779146Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0779211Z layer_outputs = layer_module( 2025-09-07T08:41:32.0779423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0779494Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0779757Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0779825Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0780079Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0780153Z self_outputs = self.self( 2025-09-07T08:41:32.0780406Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0780479Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0780482Z 2025-09-07T08:41:32.0780574Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0780893Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0781000Z layer_outputs = layer_module( 2025-09-07T08:41:32.0781203Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0781286Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0781543Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0781619Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0781876Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0781942Z self_outputs = self.self( 2025-09-07T08:41:32.0782207Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0782283Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0782286Z 2025-09-07T08:41:32.0782391Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0782714Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0782791Z layer_outputs = layer_module( 2025-09-07T08:41:32.0782993Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0783067Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0783336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0783407Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0783716Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0783826Z self_outputs = self.self( 2025-09-07T08:41:32.0784082Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0784168Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0784171Z 2025-09-07T08:41:32.0784265Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0784591Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0784655Z layer_outputs = layer_module( 2025-09-07T08:41:32.0784869Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0784940Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0785200Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0785281Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0785537Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0785610Z self_outputs = self.self( 2025-09-07T08:41:32.0785866Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0785975Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0786310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0786470Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0786658Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0786754Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0786757Z 2025-09-07T08:41:32.0786858Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0787177Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0787241Z layer_outputs = layer_module( 2025-09-07T08:41:32.0787452Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0787522Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0787786Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0787852Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0788112Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0788188Z self_outputs = self.self( 2025-09-07T08:41:32.0788443Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0788560Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0788887Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0789020Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0789310Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0789397Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0789623Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0789728Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0789731Z 2025-09-07T08:41:32.0789836Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0790155Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0790230Z layer_outputs = layer_module( 2025-09-07T08:41:32.0790432Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0790503Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0790768Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0790841Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0791103Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0791166Z self_outputs = self.self( 2025-09-07T08:41:32.0791423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0791535Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0791855Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0792029Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0792032Z 2025-09-07T08:41:32.0792127Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0792460Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0792527Z layer_outputs = layer_module( 2025-09-07T08:41:32.0792728Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0792812Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0793068Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0793142Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0793397Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0793470Z self_outputs = self.self( 2025-09-07T08:41:32.0793723Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0793831Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0794167Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0794303Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0794306Z 2025-09-07T08:41:32.0794409Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0794730Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0794805Z layer_outputs = layer_module( 2025-09-07T08:41:32.0795006Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0795079Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0795371Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0795475Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0795744Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0795807Z self_outputs = self.self( 2025-09-07T08:41:32.0796064Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0796245Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0796248Z 2025-09-07T08:41:32.0796342Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0796670Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0796738Z layer_outputs = layer_module( 2025-09-07T08:41:32.0796949Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0797022Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0797279Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0797354Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0797611Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0797723Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0797976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0798053Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0798064Z 2025-09-07T08:41:32.0798140Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0798214Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0798290Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0798359Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0798453Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0798781Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0798843Z layer_outputs = layer_module( 2025-09-07T08:41:32.0799054Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0799125Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0799389Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0799460Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0799718Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0799790Z self_outputs = self.self( 2025-09-07T08:41:32.0800046Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0800130Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0800133Z 2025-09-07T08:41:32.0800227Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0800554Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0800618Z layer_outputs = layer_module( 2025-09-07T08:41:32.0800820Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0800945Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0801217Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0801296Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0801551Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0801614Z self_outputs = self.self( 2025-09-07T08:41:32.0801873Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 509, in forward 2025-09-07T08:41:32.0801950Z query_vectors = self.query(hidden_states) 2025-09-07T08:41:32.0801953Z 2025-09-07T08:41:32.0802051Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0802373Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0802450Z layer_outputs = layer_module( 2025-09-07T08:41:32.0802651Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0802723Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0802984Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0803053Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0803314Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0803378Z self_outputs = self.self( 2025-09-07T08:41:32.0803631Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0803746Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0804056Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0804236Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0804238Z 2025-09-07T08:41:32.0804334Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0804664Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0804729Z layer_outputs = layer_module( 2025-09-07T08:41:32.0804930Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0805009Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0805266Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0805348Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0805602Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0805673Z self_outputs = self.self( 2025-09-07T08:41:32.0805929Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 510, in forward 2025-09-07T08:41:32.0806002Z key_vectors = self.key(hidden_states) 2025-09-07T08:41:32.0806005Z 2025-09-07T08:41:32.0806107Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0806426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0806500Z layer_outputs = layer_module( 2025-09-07T08:41:32.0806727Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0806832Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0807096Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0807163Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0807426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0807489Z self_outputs = self.self( 2025-09-07T08:41:32.0807752Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0807847Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0808159Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 790, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0808316Z key = self._chunk(key, window_overlap, getattr(self.config, "onnx_export", False)) 2025-09-07T08:41:32.0808572Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 719, in _chunk 2025-09-07T08:41:32.0808649Z hidden_states = hidden_states.view( 2025-09-07T08:41:32.0808652Z 2025-09-07T08:41:32.0808745Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0809075Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0809139Z layer_outputs = layer_module( 2025-09-07T08:41:32.0809339Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0809418Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0809676Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0809754Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0810010Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0810075Z self_outputs = self.self( 2025-09-07T08:41:32.0810336Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0810430Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0810745Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0810914Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0810917Z 2025-09-07T08:41:32.0811025Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0811346Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0811411Z layer_outputs = layer_module( 2025-09-07T08:41:32.0811621Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0811694Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0811955Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0812021Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0812277Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0812348Z self_outputs = self.self( 2025-09-07T08:41:32.0812642Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0812759Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0813078Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0813257Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0813260Z 2025-09-07T08:41:32.0813355Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0813684Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0813759Z layer_outputs = layer_module( 2025-09-07T08:41:32.0813969Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0814056Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0814318Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0814398Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0814662Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0814726Z self_outputs = self.self( 2025-09-07T08:41:32.0814994Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 524, in forward 2025-09-07T08:41:32.0815088Z attn_scores = self._sliding_chunks_query_key_matmul( 2025-09-07T08:41:32.0815415Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 796, in _sliding_chunks_query_key_matmul 2025-09-07T08:41:32.0815586Z diagonal_chunked_attention_scores = torch.einsum("bcxd,bcyd->bcxy", (query, key)) # multiply 2025-09-07T08:41:32.0815591Z 2025-09-07T08:41:32.0815672Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0815745Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0815838Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0816173Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0816240Z layer_outputs = layer_module( 2025-09-07T08:41:32.0816453Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0816527Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0816794Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0816876Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0817138Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0817213Z self_outputs = self.self( 2025-09-07T08:41:32.0817474Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 541, in forward 2025-09-07T08:41:32.0817544Z attn_scores += diagonal_mask 2025-09-07T08:41:32.0817555Z 2025-09-07T08:41:32.0817650Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0817976Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0818054Z layer_outputs = layer_module( 2025-09-07T08:41:32.0818291Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0818397Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0818667Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0818734Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0819001Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0819063Z self_outputs = self.self( 2025-09-07T08:41:32.0819324Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 579, in forward 2025-09-07T08:41:32.0819397Z attn_probs = nn.functional.softmax( 2025-09-07T08:41:32.0819400Z 2025-09-07T08:41:32.0819501Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0819822Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0819889Z layer_outputs = layer_module( 2025-09-07T08:41:32.0820095Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0820167Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0820426Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0820492Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0820747Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0820818Z self_outputs = self.self( 2025-09-07T08:41:32.0821074Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 511, in forward 2025-09-07T08:41:32.0821163Z value_vectors = self.value(hidden_states) 2025-09-07T08:41:32.0821167Z 2025-09-07T08:41:32.0821262Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0821587Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0821651Z layer_outputs = layer_module( 2025-09-07T08:41:32.0821850Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0821928Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0822184Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0822258Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0822515Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0822582Z self_outputs = self.self( 2025-09-07T08:41:32.0822847Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0822954Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0823281Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 863, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0823439Z padded_value = nn.functional.pad(value, (0, 0, window_overlap, window_overlap), value=-1) 2025-09-07T08:41:32.0823627Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0823719Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0823723Z 2025-09-07T08:41:32.0823817Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0824174Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0824268Z layer_outputs = layer_module( 2025-09-07T08:41:32.0824475Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0824546Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0824808Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0824876Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0825135Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0825207Z self_outputs = self.self( 2025-09-07T08:41:32.0825461Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0825580Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0849826Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 876, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0850071Z chunked_attn_probs = self._pad_and_diagonalize(chunked_attn_probs) 2025-09-07T08:41:32.0850423Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 699, in _pad_and_diagonalize 2025-09-07T08:41:32.0850529Z chunked_hidden_states = nn.functional.pad( 2025-09-07T08:41:32.0850724Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/nn/functional.py", line 5294, in pad 2025-09-07T08:41:32.0850835Z return torch._C._nn.pad(input, pad, mode, value) 2025-09-07T08:41:32.0850841Z 2025-09-07T08:41:32.0850948Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0851309Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0851390Z layer_outputs = layer_module( 2025-09-07T08:41:32.0851604Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0851691Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0851957Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0852045Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0852311Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0852383Z self_outputs = self.self( 2025-09-07T08:41:32.0852657Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0852777Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0853111Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0853255Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0853259Z 2025-09-07T08:41:32.0853371Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0853701Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0853771Z layer_outputs = layer_module( 2025-09-07T08:41:32.0853987Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0854157Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0854457Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0854554Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0854825Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0854894Z self_outputs = self.self( 2025-09-07T08:41:32.0855160Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 613, in forward 2025-09-07T08:41:32.0855281Z attn_output = self._sliding_chunks_matmul_attn_probs_value( 2025-09-07T08:41:32.0855617Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 878, in _sliding_chunks_matmul_attn_probs_value 2025-09-07T08:41:32.0855770Z context = torch.einsum("bcwd,bcdh->bcwh", (chunked_attn_probs, chunked_value)) 2025-09-07T08:41:32.0855775Z 2025-09-07T08:41:32.0855876Z cudagraph partition due to non gpu ops. 
Found from : 2025-09-07T08:41:32.0856205Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0856284Z layer_outputs = layer_module( 2025-09-07T08:41:32.0856494Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0856582Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0856839Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0856921Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0857180Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1135, in forward 2025-09-07T08:41:32.0857252Z self_outputs = self.self( 2025-09-07T08:41:32.0857521Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 618, in forward 2025-09-07T08:41:32.0857706Z attn_output = attn_output.transpose(0, 1).reshape(seq_len, batch_size, embed_dim).contiguous() 2025-09-07T08:41:32.0857710Z 2025-09-07T08:41:32.0857807Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:41:32.0858129Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1259, in torch_dynamo_resume_in_forward_at_1244 2025-09-07T08:41:32.0858205Z layer_outputs = layer_module( 2025-09-07T08:41:32.0858413Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/modeling_layers.py", line 94, in __call__ 2025-09-07T08:41:32.0858499Z return super().__call__(*args, **kwargs) 2025-09-07T08:41:32.0858765Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1199, in forward 2025-09-07T08:41:32.0858851Z self_attn_outputs = self.attention( 2025-09-07T08:41:32.0859107Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1144, in forward 2025-09-07T08:41:32.0859212Z attn_output = self.output(self_outputs[0], hidden_states) 2025-09-07T08:41:32.0859479Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1094, in forward 2025-09-07T08:41:32.0859558Z hidden_states = self.dense(hidden_states) 2025-09-07T08:41:32.0859561Z 2025-09-07T08:41:32.0859646Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0859723Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0859795Z cudagraph partition due to non gpu ops 2025-09-07T08:41:32.0859876Z cudagraph partition due to non gpu ops 2025-09-07T08:42:24.4291918Z cudagraph partition due to non gpu ops. 
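The repeated "cudagraph partition due to non gpu ops" entries above are Inductor reporting where it splits a compiled region it would otherwise capture as a single CUDA graph: each flagged op does not run on the GPU (this job is a CPU-only benchmark), and the attached stack shows the Longformer source line that produced it. As a rough illustration only (the module below is hypothetical and not taken from this run), a CPU-side op inside an otherwise GPU forward is the kind of pattern that forces such a partition when cudagraphs are enabled through torch.compile's "reduce-overhead" mode:

```python
# Hypothetical sketch: a non-GPU op inside a compiled forward, the kind of
# pattern the partition messages above point at. Assumes a CUDA device.
import torch
import torch.nn as nn

class MixedDeviceBlock(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.proj = nn.Linear(64, 64)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.proj(x)
        scale = x.detach().cpu().abs().mean()   # non-GPU op: cannot live inside a CUDA graph
        return x * scale.to(x.device)

if torch.cuda.is_available():
    model = MixedDeviceBlock().cuda()
    compiled = torch.compile(model, mode="reduce-overhead")  # enables cudagraphs
    out = compiled(torch.randn(8, 64, device="cuda"))
    # Running with verbose Inductor logging (e.g. TORCH_LOGS=inductor) surfaces
    # partition/graph-break reasons analogous to the messages in this log.
```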
Found from : 2025-09-07T08:42:24.4293039Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1716, in torch_dynamo_resume_in_forward_at_1703 2025-09-07T08:42:24.4293596Z prediction_scores = self.lm_head(sequence_output) 2025-09-07T08:42:24.4294028Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1333, in forward 2025-09-07T08:42:24.4294429Z x = self.dense(features) 2025-09-07T08:42:24.4294542Z 2025-09-07T08:42:24.4294624Z cudagraph partition due to non gpu ops 2025-09-07T08:42:24.4294830Z cudagraph partition due to non gpu ops 2025-09-07T08:42:24.4295026Z cudagraph partition due to non gpu ops 2025-09-07T08:42:24.4295220Z cudagraph partition due to non gpu ops 2025-09-07T08:42:24.4295434Z cudagraph partition due to non gpu ops. Found from : 2025-09-07T08:42:24.4295919Z File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/transformers/models/longformer/modeling_longformer.py", line 1723, in torch_dynamo_resume_in_forward_at_1703 2025-09-07T08:42:24.4296484Z masked_lm_loss = loss_fct(prediction_scores.view(-1, self.config.vocab_size), labels.view(-1)) 2025-09-07T08:42:24.4296714Z 2025-09-07T08:42:31.6469488Z 2025-09-07T08:42:33.8899099Z running benchmark: 0% 0/30 [00:00> $GITHUB_ENV 2025-09-07T09:22:00.5934972Z echo "DEVICE_TYPE=$DEVICE_TYPE" >> $GITHUB_ENV 2025-09-07T09:22:00.5943463Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:00.5943723Z env: 2025-09-07T09:22:00.5943900Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:00.5944220Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:00.5944537Z ##[endgroup] 2025-09-07T09:22:00.5969547Z + [[ -n '' ]] 2025-09-07T09:22:00.5969897Z + python3 -mpip install boto3==1.35.33 psutil==7.0.0 pynvml==12.0.0 2025-09-07T09:22:00.7704515Z Defaulting to user installation because normal site-packages is not writeable 2025-09-07T09:22:01.4987385Z Collecting boto3==1.35.33 2025-09-07T09:22:01.5100104Z Downloading boto3-1.35.33-py3-none-any.whl (139 kB) 2025-09-07T09:22:01.7105069Z Collecting psutil==7.0.0 2025-09-07T09:22:01.7127132Z Downloading psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (277 kB) 2025-09-07T09:22:01.7381562Z Collecting pynvml==12.0.0 2025-09-07T09:22:01.7405959Z Downloading pynvml-12.0.0-py3-none-any.whl (26 kB) 2025-09-07T09:22:01.7767069Z Collecting s3transfer<0.11.0,>=0.10.0 2025-09-07T09:22:01.7791645Z Downloading s3transfer-0.10.4-py3-none-any.whl (83 kB) 2025-09-07T09:22:01.7836890Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.33) (0.10.0) 2025-09-07T09:22:02.5089174Z Collecting botocore<1.36.0,>=1.35.33 2025-09-07T09:22:02.5110664Z Downloading botocore-1.35.99-py3-none-any.whl (13.3 MB) 2025-09-07T09:22:02.6321993Z Collecting nvidia-ml-py<13.0.0a0,>=12.0.0 2025-09-07T09:22:02.6344952Z Downloading nvidia_ml_py-12.575.51-py3-none-any.whl (47 kB) 2025-09-07T09:22:02.6411658Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.25.10) 2025-09-07T09:22:02.6416353Z Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.33->boto3==1.35.33) (2.8.1) 2025-09-07T09:22:02.7677888Z Requirement already satisfied: six>=1.5 in 
/usr/lib/python3.9/site-packages (from python-dateutil<3.0.0,>=2.1->botocore<1.36.0,>=1.35.33->boto3==1.35.33) (1.15.0) 2025-09-07T09:22:02.8567391Z Installing collected packages: botocore, s3transfer, nvidia-ml-py, pynvml, psutil, boto3 2025-09-07T09:22:03.1695077Z Attempting uninstall: nvidia-ml-py 2025-09-07T09:22:03.1695484Z Found existing installation: nvidia-ml-py 11.525.84 2025-09-07T09:22:03.1702846Z Uninstalling nvidia-ml-py-11.525.84: 2025-09-07T09:22:03.1815888Z Successfully uninstalled nvidia-ml-py-11.525.84 2025-09-07T09:22:03.2237049Z Attempting uninstall: psutil 2025-09-07T09:22:03.2237307Z Found existing installation: psutil 5.9.8 2025-09-07T09:22:03.2277499Z Uninstalling psutil-5.9.8: 2025-09-07T09:22:03.2281196Z Successfully uninstalled psutil-5.9.8 2025-09-07T09:22:03.3455368Z Successfully installed boto3-1.35.33 botocore-1.35.99 nvidia-ml-py-12.575.51 psutil-7.0.0 pynvml-12.0.0 s3transfer-0.10.4 2025-09-07T09:22:03.4315768Z + DEVICE_NAME= 2025-09-07T09:22:03.4316032Z + DEVICE_TYPE= 2025-09-07T09:22:03.4316240Z + command -v nvidia-smi 2025-09-07T09:22:03.4316472Z + command -v rocminfo 2025-09-07T09:22:03.4316901Z + echo DEVICE_NAME= 2025-09-07T09:22:03.4341988Z + echo DEVICE_TYPE= 2025-09-07T09:22:03.4357560Z ##[group]Run set -eux 2025-09-07T09:22:03.4357758Z set -eux 2025-09-07T09:22:03.4357943Z  2025-09-07T09:22:03.4358100Z if [[ -z "${GITHUB_TOKEN}" ]]; then 2025-09-07T09:22:03.4358332Z  echo "Missing github-token input" 2025-09-07T09:22:03.4358532Z  exit 1 2025-09-07T09:22:03.4358679Z fi 2025-09-07T09:22:03.4364061Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:03.4364285Z env: 2025-09-07T09:22:03.4364440Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:03.4364731Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:03.4365031Z DEVICE_NAME: 2025-09-07T09:22:03.4365172Z DEVICE_TYPE: 2025-09-07T09:22:03.4365526Z GITHUB_TOKEN: *** 2025-09-07T09:22:03.4365682Z ##[endgroup] 2025-09-07T09:22:03.4385014Z + [[ -z *** ]] 2025-09-07T09:22:03.4412169Z ##[group]Run pytorch/test-infra/.github/actions/get-workflow-job-id@main 2025-09-07T09:22:03.4412437Z with: 2025-09-07T09:22:03.4412689Z github-token: *** 2025-09-07T09:22:03.4412840Z env: 2025-09-07T09:22:03.4412999Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:03.4413279Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:03.4413591Z DEVICE_NAME: 2025-09-07T09:22:03.4413740Z DEVICE_TYPE: 2025-09-07T09:22:03.4413885Z ##[endgroup] 2025-09-07T09:22:03.4422416Z ##[group]Run set -eux 2025-09-07T09:22:03.4422592Z set -eux 2025-09-07T09:22:03.4422736Z  2025-09-07T09:22:03.4423010Z python3 "${GITHUB_ACTION_PATH}/../../scripts/get_workflow_job_id.py" "${GITHUB_RUN_ID}" "${RUNNER_NAME}" 2025-09-07T09:22:03.4427322Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:03.4427548Z env: 2025-09-07T09:22:03.4427694Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:03.4427977Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:03.4428256Z DEVICE_NAME: 2025-09-07T09:22:03.4428401Z DEVICE_TYPE: 2025-09-07T09:22:03.4428660Z GITHUB_TOKEN: *** 2025-09-07T09:22:03.4429471Z ##[endgroup] 2025-09-07T09:22:03.4447354Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/get-workflow-job-id/../../scripts/get_workflow_job_id.py 17525285611 i-081e6be8c4291059d 2025-09-07T09:22:04.0688166Z setting job-id=49775585769 
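The get-workflow-job-id step prints the resolved job id above and the matching job name just below; it maps the workflow run id plus the runner name onto the concrete job through the GitHub API. A rough sketch of that lookup follows (this is not the actual get_workflow_job_id.py; the endpoint and field names are the public "list jobs for a workflow run" API, and the `requests` dependency is an assumption):

```python
# Rough sketch of resolving a job id from (run id, runner name).
# Assumes `requests` is available and GITHUB_TOKEN is set; not the real script.
from typing import Optional

import requests

def get_job_id(repo: str, run_id: str, runner_name: str, token: str) -> Optional[int]:
    url = f"https://api.github.com/repos/{repo}/actions/runs/{run_id}/jobs"
    headers = {"Authorization": f"Bearer {token}", "Accept": "application/vnd.github+json"}
    page = 1
    while True:
        resp = requests.get(url, headers=headers, params={"per_page": 100, "page": page})
        resp.raise_for_status()
        jobs = resp.json().get("jobs", [])
        if not jobs:
            return None  # ran out of pages without finding this runner
        for job in jobs:
            if job.get("runner_name") == runner_name:
                return job["id"]
        page += 1

# Roughly what this step did:
# get_job_id("pytorch/pytorch", "17525285611", "i-081e6be8c4291059d", os.environ["GITHUB_TOKEN"])
```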
2025-09-07T09:22:04.0688593Z setting job-name=inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T09:22:04.0759256Z ##[group]Run set -eux 2025-09-07T09:22:04.0759441Z set -eux 2025-09-07T09:22:04.0759591Z  2025-09-07T09:22:04.0759735Z if [[ -n "" ]]; then 2025-09-07T09:22:04.0759908Z  source "" 2025-09-07T09:22:04.0760053Z fi 2025-09-07T09:22:04.0760246Z  2025-09-07T09:22:04.0760479Z python3 "${GITHUB_ACTION_PATH}/../../scripts/benchmarks/gather_metadata.py" \ 2025-09-07T09:22:04.0760776Z  --schema-version "${SCHEMA_VERSION}" \ 2025-09-07T09:22:04.0760978Z  --repo "${REPO}" \ 2025-09-07T09:22:04.0761172Z  --head-branch "${HEAD_BRANCH}" \ 2025-09-07T09:22:04.0761422Z  --head-sha "${HEAD_SHA}" \ 2025-09-07T09:22:04.0761632Z  --workflow-id "${WORKFLOW_RUN_ID}" \ 2025-09-07T09:22:04.0761840Z  --run-attempt "${RUN_ATTEMPT}" \ 2025-09-07T09:22:04.0762035Z  --job-id "${JOB_ID}" \ 2025-09-07T09:22:04.0762223Z  --job-name "${JOB_NAME}" 2025-09-07T09:22:04.0765790Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:04.0766013Z env: 2025-09-07T09:22:04.0766155Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:04.0766433Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:04.0766726Z DEVICE_NAME: 2025-09-07T09:22:04.0766880Z DEVICE_TYPE: 2025-09-07T09:22:04.0767019Z SCHEMA_VERSION: v3 2025-09-07T09:22:04.0767185Z REPO: pytorch/pytorch 2025-09-07T09:22:04.0767352Z HEAD_BRANCH: refs/heads/main 2025-09-07T09:22:04.0767562Z HEAD_SHA: 93fb23d6fae7c4e82c4239a1033e522088742634 2025-09-07T09:22:04.0767766Z WORKFLOW_RUN_ID: 17525285611 2025-09-07T09:22:04.0767936Z RUN_ATTEMPT: 1 2025-09-07T09:22:04.0768083Z JOB_ID: 49775585769 2025-09-07T09:22:04.0768400Z JOB_NAME: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T09:22:04.0768732Z ##[endgroup] 2025-09-07T09:22:04.0788459Z + [[ -n '' ]] 2025-09-07T09:22:04.0789762Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/benchmarks/gather_metadata.py --schema-version v3 --repo pytorch/pytorch --head-branch refs/heads/main --head-sha 93fb23d6fae7c4e82c4239a1033e522088742634 --workflow-id 17525285611 --run-attempt 1 --job-id 49775585769 --job-name 'inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)' 2025-09-07T09:22:04.1016088Z ##[group]Run set -eux 2025-09-07T09:22:04.1016268Z set -eux 2025-09-07T09:22:04.1016426Z  2025-09-07T09:22:04.1016571Z if [[ -n "" ]]; then 2025-09-07T09:22:04.1016756Z  source "" 2025-09-07T09:22:04.1016910Z fi 2025-09-07T09:22:04.1017050Z  2025-09-07T09:22:04.1017289Z python3 "${GITHUB_ACTION_PATH}/../../scripts/benchmarks/gather_runners_info.py" 2025-09-07T09:22:04.1020914Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:04.1021141Z env: 2025-09-07T09:22:04.1021284Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:04.1021555Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:04.1021835Z DEVICE_NAME: 2025-09-07T09:22:04.1021988Z DEVICE_TYPE: 2025-09-07T09:22:04.1022137Z ##[endgroup] 2025-09-07T09:22:04.1037260Z + [[ -n '' ]] 2025-09-07T09:22:04.1037781Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/benchmarks/gather_runners_info.py 2025-09-07T09:22:04.1320377Z 
INFO:root:Fail to import torch to get the device name 2025-09-07T09:22:04.1395087Z ##[group]Run set -eux 2025-09-07T09:22:04.1395257Z set -eux 2025-09-07T09:22:04.1395398Z  2025-09-07T09:22:04.1395555Z # TODO (huydhn): Implement this part 2025-09-07T09:22:04.1395795Z echo "dependencies={}" >> "${GITHUB_OUTPUT}" 2025-09-07T09:22:04.1399119Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:04.1399347Z env: 2025-09-07T09:22:04.1399485Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:04.1399764Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:04.1400058Z DEVICE_NAME: 2025-09-07T09:22:04.1400263Z DEVICE_TYPE: 2025-09-07T09:22:04.1400408Z ##[endgroup] 2025-09-07T09:22:04.1415547Z + echo 'dependencies={}' 2025-09-07T09:22:04.1429376Z ##[group]Run set -eux 2025-09-07T09:22:04.1429571Z set -eux 2025-09-07T09:22:04.1429712Z  2025-09-07T09:22:04.1429868Z if [[ -n "" ]]; then 2025-09-07T09:22:04.1430044Z  source "" 2025-09-07T09:22:04.1430249Z fi 2025-09-07T09:22:04.1430380Z  2025-09-07T09:22:04.1430551Z if [[ ! -d "${BENCHMARK_RESULTS_DIR}" ]]; then 2025-09-07T09:22:04.1430813Z  echo "${BENCHMARK_RESULTS_DIR} does not exist, skipping" 2025-09-07T09:22:04.1431098Z  # We don't want the job to fail if the directory doesn't exist 2025-09-07T09:22:04.1431321Z  exit 0 2025-09-07T09:22:04.1431455Z fi 2025-09-07T09:22:04.1431586Z  2025-09-07T09:22:04.1431744Z if [[ "${DRY_RUN}" == "true" ]]; then 2025-09-07T09:22:04.1432029Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-09-07T09:22:04.1432350Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-09-07T09:22:04.1432605Z  --metadata "${BENCHMARK_METADATA}" \ 2025-09-07T09:22:04.1432819Z  --runners "${RUNNER_INFO}" \ 2025-09-07T09:22:04.1433033Z  --dependencies "${DEPENDENCIES}" \ 2025-09-07T09:22:04.1433225Z  --dry-run 2025-09-07T09:22:04.1433382Z else 2025-09-07T09:22:04.1433628Z  python3 "${GITHUB_ACTION_PATH}/../../scripts/upload_benchmark_results.py" \ 2025-09-07T09:22:04.1433933Z  --benchmark-results-dir "${BENCHMARK_RESULTS_DIR}" \ 2025-09-07T09:22:04.1434178Z  --metadata "${BENCHMARK_METADATA}" \ 2025-09-07T09:22:04.1434377Z  --runners "${RUNNER_INFO}" \ 2025-09-07T09:22:04.1434585Z  --dependencies "${DEPENDENCIES}" 2025-09-07T09:22:04.1434771Z fi 2025-09-07T09:22:04.1438465Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:04.1438678Z env: 2025-09-07T09:22:04.1438833Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:04.1439112Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:04.1439403Z DEVICE_NAME: 2025-09-07T09:22:04.1439544Z DEVICE_TYPE: 2025-09-07T09:22:04.1439718Z BENCHMARK_RESULTS_DIR: test/test-reports 2025-09-07T09:22:04.1439911Z DRY_RUN: false 2025-09-07T09:22:04.1440678Z BENCHMARK_METADATA: {"timestamp": 1757236924, "schema_version": "v3", "name": "inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "93fb23d6fae7c4e82c4239a1033e522088742634", "workflow_id": 17525285611, "run_attempt": 1, "job_id": 49775585769} 2025-09-07T09:22:04.1441625Z RUNNER_INFO: [{"cpu_info": "x86_64", "cpu_count": 96, "avail_mem_in_gb": 188, "extra_info": {"hostname": "ip-10-0-37-56.ec2.internal"}, "name": "", "type": ""}] 2025-09-07T09:22:04.1441971Z DEPENDENCIES: {} 2025-09-07T09:22:04.1442120Z ##[endgroup] 2025-09-07T09:22:04.1458549Z + [[ -n '' ]] 
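The env block above carries the assembled BENCHMARK_METADATA and RUNNER_INFO JSON that the next command hands to upload_benchmark_results.py. RUNNER_INFO came from gather_runners_info.py, which also tried (and, per the INFO line, failed) to import torch for a device name on this CPU runner. A rough sketch of how such a runner record can be put together, for illustration only (field names are copied from the log; the exact data sources are assumptions, not the actual script):

```python
# Illustrative only: assembling a record shaped like the RUNNER_INFO entry above.
import json
import platform
import socket

import psutil  # psutil==7.0.0 was installed earlier in this job

runner = {
    "cpu_info": platform.machine(),                                     # "x86_64" on this runner
    "cpu_count": psutil.cpu_count(logical=True),                        # 96 here
    "avail_mem_in_gb": psutil.virtual_memory().available // (1024**3),  # ~188 here
    "extra_info": {"hostname": socket.gethostname()},
    "name": "",   # empty in this run: no GPU, so DEVICE_NAME stayed unset
    "type": "",   # likewise for DEVICE_TYPE
}
print(json.dumps([runner]))
```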
2025-09-07T09:22:04.1458727Z + [[ ! -d test/test-reports ]] 2025-09-07T09:22:04.1458923Z + [[ false == \t\r\u\e ]] 2025-09-07T09:22:04.1460617Z + python3 /home/ec2-user/actions-runner/_work/_actions/pytorch/test-infra/main/.github/actions/upload-benchmark-results/../../scripts/upload_benchmark_results.py --benchmark-results-dir test/test-reports --metadata '{"timestamp": 1757236924, "schema_version": "v3", "name": "inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)", "repo": "pytorch/pytorch", "head_branch": "refs/heads/main", "head_sha": "93fb23d6fae7c4e82c4239a1033e522088742634", "workflow_id": 17525285611, "run_attempt": 1, "job_id": 49775585769}' --runners '[{"cpu_info": "x86_64", "cpu_count": 96, "avail_mem_in_gb": 188, "extra_info": {"hostname": "ip-10-0-37-56.ec2.internal"}, "name": "", "type": ""}]' --dependencies '{}' 2025-09-07T09:22:04.2473517Z INFO:root:Upload test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_accuracy.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_accuracy.json 2025-09-07T09:22:04.2676332Z INFO:botocore.credentials:Found credentials from IAM Role: gh-ci-github-action-runners-runner-role 2025-09-07T09:22:04.4358854Z INFO:root:Upload test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_accuracy.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_dynamic_huggingface_amp_inference_cpu_x86_accuracy.json 2025-09-07T09:22:04.5093881Z INFO:root:Upload test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_accuracy.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_accuracy.json 2025-09-07T09:22:04.6279854Z INFO:root:Upload test/test-reports/inductor_export_huggingface_amp_inference_cpu_x86_accuracy.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_export_huggingface_amp_inference_cpu_x86_accuracy.json 2025-09-07T09:22:04.6986661Z INFO:root:Upload test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_accuracy.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_accuracy.json 2025-09-07T09:22:04.7752610Z INFO:root:Upload test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance.json 2025-09-07T09:22:04.8692413Z INFO:root:Upload test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json 2025-09-07T09:22:04.9748236Z INFO:root:Upload test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance.json 2025-09-07T09:22:05.0732669Z INFO:root:Upload test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json 
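As the INFO lines here show, upload_benchmark_results.py walks test/test-reports and pushes each report JSON to the ossci-benchmarks bucket under a key of the form v3/<repo>/<workflow_id>/<job_id>/<filename>, using the runner's IAM role for credentials. A minimal sketch of that layout (the bucket and key pattern are read off the log; the code is not the actual upload script):

```python
# Minimal sketch mirroring the upload layout visible in the INFO lines;
# not the actual upload_benchmark_results.py.
from pathlib import Path

import boto3  # boto3==1.35.33 was installed earlier in this job

def upload_benchmark_jsons(results_dir: str, repo: str, workflow_id: int, job_id: int) -> None:
    s3 = boto3.client("s3")  # credentials come from the instance's IAM role
    for path in sorted(Path(results_dir).glob("*.json")):
        key = f"v3/{repo}/{workflow_id}/{job_id}/{path.name}"
        print(f"Upload {path} to s3://ossci-benchmarks/{key}")
        s3.upload_file(str(path), "ossci-benchmarks", key)

# Matches this job's invocation:
# upload_benchmark_jsons("test/test-reports", "pytorch/pytorch", 17525285611, 49775585769)
```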
2025-09-07T09:22:05.1557423Z INFO:root:Upload test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance.json 2025-09-07T09:22:05.2384832Z INFO:root:Upload test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json 2025-09-07T09:22:05.3216691Z INFO:root:Upload test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance.json 2025-09-07T09:22:05.4065915Z INFO:root:Upload test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json to s3://ossci-benchmarks/v3/pytorch/pytorch/17525285611/49775585769/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json 2025-09-07T09:22:05.5222840Z ##[group]Run cat test/**/*_toprint.log || true 2025-09-07T09:22:05.5223084Z cat test/**/*_toprint.log || true 2025-09-07T09:22:05.5226949Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:05.5227240Z env: 2025-09-07T09:22:05.5227398Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:05.5227681Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:05.5227965Z DEVICE_NAME: 2025-09-07T09:22:05.5228116Z DEVICE_TYPE: 2025-09-07T09:22:05.5228268Z ##[endgroup] 2025-09-07T09:22:05.5288668Z cat: 'test/**/*_toprint.log': No such file or directory 2025-09-07T09:22:05.5304131Z ##[group]Run kill "$MONITOR_SCRIPT_PID" 2025-09-07T09:22:05.5304357Z kill "$MONITOR_SCRIPT_PID" 2025-09-07T09:22:05.5307816Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:05.5308045Z env: 2025-09-07T09:22:05.5308192Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:05.5308471Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:05.5308754Z DEVICE_NAME: 2025-09-07T09:22:05.5308903Z DEVICE_TYPE: 2025-09-07T09:22:05.5309053Z MONITOR_SCRIPT_PID: 57319 2025-09-07T09:22:05.5309227Z ##[endgroup] 2025-09-07T09:22:05.5396738Z Prepare all required actions 2025-09-07T09:22:05.5397060Z Getting action download info 2025-09-07T09:22:05.6588817Z Download action repository 'seemethere/upload-artifact-s3@v5' (SHA:baba72d0712b404f646cebe0730933554ebce96a) 2025-09-07T09:22:05.8483786Z Download action repository 'actions/upload-artifact@v4' (SHA:ea165f8d65b6e75b540449e92b4886f43607fa02) 2025-09-07T09:22:06.1372241Z ##[group]Run ./.github/actions/upload-test-artifacts 2025-09-07T09:22:06.1372461Z with: 2025-09-07T09:22:06.1372723Z file-suffix: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T09:22:06.1373016Z s3-bucket: gha-artifacts 2025-09-07T09:22:06.1373184Z env: 2025-09-07T09:22:06.1373326Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.1373609Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.1373890Z DEVICE_NAME: 2025-09-07T09:22:06.1374042Z DEVICE_TYPE: 2025-09-07T09:22:06.1374194Z ##[endgroup] 2025-09-07T09:22:06.1388931Z ##[group]Run # Remove any previous test jsons if they exist 2025-09-07T09:22:06.1389213Z # 
Remove any previous test jsons if they exist 2025-09-07T09:22:06.1389443Z rm -f test-jsons-*.zip 2025-09-07T09:22:06.1389696Z zip -r "test-jsons-${FILE_SUFFIX}.zip" test/test-reports -i '*.json' 2025-09-07T09:22:06.1393107Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:06.1393337Z env: 2025-09-07T09:22:06.1393489Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.1393765Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.1394055Z DEVICE_NAME: 2025-09-07T09:22:06.1394208Z DEVICE_TYPE: 2025-09-07T09:22:06.1394471Z FILE_SUFFIX: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T09:22:06.1394752Z ##[endgroup] 2025-09-07T09:22:06.1550085Z adding: test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_accuracy.json (deflated 99%) 2025-09-07T09:22:06.1569975Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_accuracy.json (deflated 99%) 2025-09-07T09:22:06.1587079Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_accuracy.json (deflated 99%) 2025-09-07T09:22:06.1605206Z adding: test/test-reports/inductor_export_huggingface_amp_inference_cpu_x86_accuracy.json (deflated 99%) 2025-09-07T09:22:06.1622045Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_accuracy.json (deflated 99%) 2025-09-07T09:22:06.1649259Z adding: test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance.json (deflated 99%) 2025-09-07T09:22:06.1706257Z adding: test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json (deflated 99%) 2025-09-07T09:22:06.1732420Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance.json (deflated 99%) 2025-09-07T09:22:06.1788520Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json (deflated 99%) 2025-09-07T09:22:06.1815550Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance.json (deflated 99%) 2025-09-07T09:22:06.1871308Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json (deflated 99%) 2025-09-07T09:22:06.1895893Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance.json (deflated 99%) 2025-09-07T09:22:06.1930778Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.json (deflated 99%) 2025-09-07T09:22:06.1947415Z ##[group]Run # Remove any previous test reports if they exist 2025-09-07T09:22:06.1947696Z # Remove any previous test reports if they exist 2025-09-07T09:22:06.1947927Z rm -f test-reports-*.zip 2025-09-07T09:22:06.1948577Z zip -r "test-reports-${FILE_SUFFIX}.zip" test/test-reports -i '*.xml' -i '*.csv' 2025-09-07T09:22:06.1951949Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:06.1952176Z env: 2025-09-07T09:22:06.1952325Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.1952605Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.1952896Z DEVICE_NAME: 2025-09-07T09:22:06.1953047Z DEVICE_TYPE: 2025-09-07T09:22:06.1953312Z FILE_SUFFIX: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T09:22:06.1953601Z ##[endgroup] 2025-09-07T09:22:06.1995408Z adding: 
test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_accuracy.csv (deflated 63%) 2025-09-07T09:22:06.2007612Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_accuracy.csv (deflated 63%) 2025-09-07T09:22:06.2013164Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_accuracy.csv (deflated 63%) 2025-09-07T09:22:06.2017487Z adding: test/test-reports/inductor_export_huggingface_amp_inference_cpu_x86_accuracy.csv (deflated 63%) 2025-09-07T09:22:06.2017955Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_accuracy.csv (deflated 73%) 2025-09-07T09:22:06.2018518Z adding: test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance.csv (deflated 53%) 2025-09-07T09:22:06.2019989Z adding: test/test-reports/inductor_no_cudagraphs_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.csv (deflated 51%) 2025-09-07T09:22:06.2020532Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance.csv (deflated 53%) 2025-09-07T09:22:06.2021996Z adding: test/test-reports/inductor_dynamic_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.csv (deflated 50%) 2025-09-07T09:22:06.2022521Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance.csv (deflated 53%) 2025-09-07T09:22:06.2024047Z adding: test/test-reports/inductor_cpp_wrapper_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.csv (deflated 51%) 2025-09-07T09:22:06.2024583Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance.csv (deflated 54%) 2025-09-07T09:22:06.2025540Z adding: test/test-reports/inductor_aot_inductor_huggingface_amp_inference_cpu_x86_performance_compilation_metrics.csv (deflated 51%) 2025-09-07T09:22:06.2039406Z ##[group]Run # Remove any previous usage logs if they exist 2025-09-07T09:22:06.2039700Z # Remove any previous usage logs if they exist 2025-09-07T09:22:06.2039927Z rm -f logs-*.zip 2025-09-07T09:22:06.2040152Z zip "logs-${FILE_SUFFIX}.zip" 'usage_log.txt' || true 2025-09-07T09:22:06.2040457Z zip -r "logs-${FILE_SUFFIX}.zip" test/test-reports -i '*.log' || true 2025-09-07T09:22:06.2044364Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:06.2044598Z env: 2025-09-07T09:22:06.2044747Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.2045033Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.2045321Z DEVICE_NAME: 2025-09-07T09:22:06.2045478Z DEVICE_TYPE: 2025-09-07T09:22:06.2045745Z FILE_SUFFIX: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T09:22:06.2046083Z ##[endgroup] 2025-09-07T09:22:06.2116687Z adding: usage_log.txt (deflated 96%) 2025-09-07T09:22:06.2124211Z 2025-09-07T09:22:06.2124443Z zip error: Nothing to do! 
(logs-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip) 2025-09-07T09:22:06.2138802Z ##[group]Run # Remove any previous debugging artifacts if they exist 2025-09-07T09:22:06.2139130Z # Remove any previous debugging artifacts if they exist 2025-09-07T09:22:06.2139376Z rm -f debug-*.zip 2025-09-07T09:22:06.2139664Z if [ -d 'test/debug' ]; then 2025-09-07T09:22:06.2139918Z  zip -r "debug-${FILE_SUFFIX}.zip" test/debug 2025-09-07T09:22:06.2140134Z fi 2025-09-07T09:22:06.2143725Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:06.2143954Z env: 2025-09-07T09:22:06.2144096Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.2144383Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.2144687Z DEVICE_NAME: 2025-09-07T09:22:06.2144838Z DEVICE_TYPE: 2025-09-07T09:22:06.2145101Z FILE_SUFFIX: test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769 2025-09-07T09:22:06.2145395Z ##[endgroup] 2025-09-07T09:22:06.2194220Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-09-07T09:22:06.2194432Z with: 2025-09-07T09:22:06.2194594Z s3-bucket: gha-artifacts 2025-09-07T09:22:06.2194808Z s3-prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:06.2195039Z retention-days: 14 2025-09-07T09:22:06.2195213Z if-no-files-found: warn 2025-09-07T09:22:06.2195395Z path: test-jsons-*.zip 2025-09-07T09:22:06.2195564Z name: artifact 2025-09-07T09:22:06.2195717Z region: us-east-1 2025-09-07T09:22:06.2195859Z env: 2025-09-07T09:22:06.2196002Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.2196295Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.2196593Z DEVICE_NAME: 2025-09-07T09:22:06.2196735Z DEVICE_TYPE: 2025-09-07T09:22:06.2196879Z ##[endgroup] 2025-09-07T09:22:06.4643893Z NOTE: s3-prefix specified, ignoring name parameter 2025-09-07T09:22:06.4644160Z With the provided path, there will be 1 file uploaded 2025-09-07T09:22:06.4644432Z Uploading to s3 prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:06.4673372Z Starting upload of test-jsons-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:06.5801569Z Finished upload of test-jsons-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:06.5938588Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-09-07T09:22:06.5938793Z with: 2025-09-07T09:22:06.5938948Z s3-bucket: gha-artifacts 2025-09-07T09:22:06.5939158Z s3-prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:06.5939368Z retention-days: 14 2025-09-07T09:22:06.5939521Z if-no-files-found: error 2025-09-07T09:22:06.5939705Z path: test-reports-*.zip 2025-09-07T09:22:06.5939872Z name: artifact 2025-09-07T09:22:06.5940020Z region: us-east-1 2025-09-07T09:22:06.5940160Z env: 2025-09-07T09:22:06.5940299Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.5940575Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.5940863Z DEVICE_NAME: 2025-09-07T09:22:06.5941000Z DEVICE_TYPE: 2025-09-07T09:22:06.5941146Z ##[endgroup] 2025-09-07T09:22:06.8235101Z NOTE: s3-prefix specified, ignoring name parameter 2025-09-07T09:22:06.8235418Z With the provided path, there will be 1 file uploaded 2025-09-07T09:22:06.8235889Z Uploading to s3 prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:06.8264426Z Starting upload of 
test-reports-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:06.9520254Z Finished upload of test-reports-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:06.9649079Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-09-07T09:22:06.9649289Z with: 2025-09-07T09:22:06.9649443Z s3-bucket: gha-artifacts 2025-09-07T09:22:06.9649652Z s3-prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:06.9649866Z retention-days: 14 2025-09-07T09:22:06.9650024Z if-no-files-found: ignore 2025-09-07T09:22:06.9650197Z path: logs-*.zip 2025-09-07T09:22:06.9650347Z name: artifact 2025-09-07T09:22:06.9650495Z region: us-east-1 2025-09-07T09:22:06.9650633Z env: 2025-09-07T09:22:06.9650773Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:06.9651167Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:06.9651458Z DEVICE_NAME: 2025-09-07T09:22:06.9651602Z DEVICE_TYPE: 2025-09-07T09:22:06.9651752Z ##[endgroup] 2025-09-07T09:22:07.1831353Z NOTE: s3-prefix specified, ignoring name parameter 2025-09-07T09:22:07.1831656Z With the provided path, there will be 1 file uploaded 2025-09-07T09:22:07.1831951Z Uploading to s3 prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:07.1859544Z Starting upload of logs-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:07.3041792Z Finished upload of logs-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:07.3168332Z ##[group]Run seemethere/upload-artifact-s3@v5 2025-09-07T09:22:07.3168541Z with: 2025-09-07T09:22:07.3168699Z s3-bucket: gha-artifacts 2025-09-07T09:22:07.3168911Z s3-prefix: pytorch/pytorch/17525285611/1/artifact 2025-09-07T09:22:07.3169136Z retention-days: 14 2025-09-07T09:22:07.3169304Z if-no-files-found: ignore 2025-09-07T09:22:07.3169485Z path: debug-*.zip 2025-09-07T09:22:07.3169635Z name: artifact 2025-09-07T09:22:07.3169781Z region: us-east-1 2025-09-07T09:22:07.3169920Z env: 2025-09-07T09:22:07.3170060Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:07.3170337Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:07.3170630Z DEVICE_NAME: 2025-09-07T09:22:07.3170769Z DEVICE_TYPE: 2025-09-07T09:22:07.3170912Z ##[endgroup] 2025-09-07T09:22:07.5418917Z No files were found with the provided path: debug-*.zip. No artifacts will be uploaded. 2025-09-07T09:22:07.5553348Z ##[group]Run # shellcheck disable=SC2156 2025-09-07T09:22:07.5553603Z # shellcheck disable=SC2156 2025-09-07T09:22:07.5553939Z find . 
-iname "core.[1-9]*" -exec docker exec "${DOCKER_CONTAINER_ID}" sh -c "gdb python {} -ex 'bt' -ex 'q'" \; 2025-09-07T09:22:07.5559074Z shell: /usr/bin/bash -e {0} 2025-09-07T09:22:07.5559262Z env: 2025-09-07T09:22:07.5559408Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:07.5559685Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:07.5559978Z DEVICE_NAME: 2025-09-07T09:22:07.5560132Z DEVICE_TYPE: 2025-09-07T09:22:07.5560281Z ##[endgroup] 2025-09-07T09:22:07.7251915Z Prepare all required actions 2025-09-07T09:22:07.7252251Z Getting action download info 2025-09-07T09:22:07.8377749Z ##[group]Run ./.github/actions/upload-utilization-stats 2025-09-07T09:22:07.8377999Z with: 2025-09-07T09:22:07.8378164Z job_id: 49775585769 2025-09-07T09:22:07.8378524Z job_name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T09:22:07.8378932Z workflow_name: inductor-perf-nightly-x86 2025-09-07T09:22:07.8379155Z workflow_run_id: 17525285611 2025-09-07T09:22:07.8379344Z workflow_attempt: 1 2025-09-07T09:22:07.8379512Z env: 2025-09-07T09:22:07.8379671Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:07.8380048Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:07.8380373Z DEVICE_NAME: 2025-09-07T09:22:07.8380541Z DEVICE_TYPE: 2025-09-07T09:22:07.8380723Z ##[endgroup] 2025-09-07T09:22:07.8391809Z ##[group]Run echo "workflow_id: 17525285611" 2025-09-07T09:22:07.8392059Z echo "workflow_id: 17525285611" 2025-09-07T09:22:07.8392336Z echo "workflow_attempt: 1" 2025-09-07T09:22:07.8392591Z echo "workflow_Name: inductor-perf-nightly-x86" 2025-09-07T09:22:07.8392841Z echo "job_id: 49775585769" 2025-09-07T09:22:07.8393252Z echo "job_name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)" 2025-09-07T09:22:07.8393631Z echo "artifact_prefix: " 2025-09-07T09:22:07.8393828Z python3 --version 2025-09-07T09:22:07.8398338Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:07.8398574Z env: 2025-09-07T09:22:07.8398727Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:07.8399018Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:07.8399307Z DEVICE_NAME: 2025-09-07T09:22:07.8399464Z DEVICE_TYPE: 2025-09-07T09:22:07.8399615Z ##[endgroup] 2025-09-07T09:22:07.8417535Z workflow_id: 17525285611 2025-09-07T09:22:07.8417722Z workflow_attempt: 1 2025-09-07T09:22:07.8417907Z workflow_Name: inductor-perf-nightly-x86 2025-09-07T09:22:07.8418109Z job_id: 49775585769 2025-09-07T09:22:07.8418433Z job_name: inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal) 2025-09-07T09:22:07.8418773Z artifact_prefix: 2025-09-07T09:22:07.8427145Z Python 3.9.23 2025-09-07T09:22:07.8450061Z ##[group]Run nick-fields/retry@v3.0.0 2025-09-07T09:22:07.8450270Z with: 2025-09-07T09:22:07.8450409Z shell: bash 2025-09-07T09:22:07.8450564Z timeout_minutes: 5 2025-09-07T09:22:07.8450731Z max_attempts: 5 2025-09-07T09:22:07.8450900Z retry_wait_seconds: 30 2025-09-07T09:22:07.8451217Z command: set -eu python3 -m pip install python-dateutil==2.8.2 boto3==1.35.42 pandas==2.1.3 dataclasses_json==0.6.7 2025-09-07T09:22:07.8451560Z polling_interval_seconds: 1 2025-09-07T09:22:07.8451742Z warning_on_retry: true 2025-09-07T09:22:07.8451940Z continue_on_error: false 2025-09-07T09:22:07.8452112Z env: 2025-09-07T09:22:07.8452255Z GIT_DEFAULT_BRANCH: main 
2025-09-07T09:22:07.8452539Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:07.8452829Z DEVICE_NAME: 2025-09-07T09:22:07.8452981Z DEVICE_TYPE: 2025-09-07T09:22:07.8453121Z ##[endgroup] 2025-09-07T09:22:08.0801156Z Defaulting to user installation because normal site-packages is not writeable 2025-09-07T09:22:08.1310560Z Collecting python-dateutil==2.8.2 2025-09-07T09:22:08.1435351Z Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) 2025-09-07T09:22:08.7687280Z Collecting boto3==1.35.42 2025-09-07T09:22:08.7708495Z Downloading boto3-1.35.42-py3-none-any.whl (139 kB) 2025-09-07T09:22:09.1011566Z Collecting pandas==2.1.3 2025-09-07T09:22:09.1035650Z Downloading pandas-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.3 MB) 2025-09-07T09:22:09.1935380Z Requirement already satisfied: dataclasses_json==0.6.7 in /home/ec2-user/.local/lib/python3.9/site-packages (0.6.7) 2025-09-07T09:22:09.1942666Z Requirement already satisfied: six>=1.5 in /usr/lib/python3.9/site-packages (from python-dateutil==2.8.2) (1.15.0) 2025-09-07T09:22:09.1972568Z Requirement already satisfied: botocore<1.36.0,>=1.35.42 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (1.35.99) 2025-09-07T09:22:09.1976282Z Requirement already satisfied: s3transfer<0.11.0,>=0.10.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.4) 2025-09-07T09:22:09.1978847Z Requirement already satisfied: jmespath<2.0.0,>=0.7.1 in /usr/lib/python3.9/site-packages (from boto3==1.35.42) (0.10.0) 2025-09-07T09:22:09.2379000Z Requirement already satisfied: pytz>=2020.1 in /usr/lib/python3.9/site-packages (from pandas==2.1.3) (2022.7.1) 2025-09-07T09:22:09.2576245Z Collecting tzdata>=2022.1 2025-09-07T09:22:09.2597833Z Downloading tzdata-2025.2-py2.py3-none-any.whl (347 kB) 2025-09-07T09:22:09.7857304Z Collecting numpy<2,>=1.22.4 2025-09-07T09:22:09.7882978Z Downloading numpy-1.26.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB) 2025-09-07T09:22:09.9108095Z Requirement already satisfied: marshmallow<4.0.0,>=3.18.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from dataclasses_json==0.6.7) (3.26.1) 2025-09-07T09:22:09.9108847Z Requirement already satisfied: typing-inspect<1,>=0.4.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from dataclasses_json==0.6.7) (0.9.0) 2025-09-07T09:22:09.9169067Z Requirement already satisfied: urllib3<1.27,>=1.25.4 in /usr/lib/python3.9/site-packages (from botocore<1.36.0,>=1.35.42->boto3==1.35.42) (1.25.10) 2025-09-07T09:22:09.9229057Z Requirement already satisfied: packaging>=17.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from marshmallow<4.0.0,>=3.18.0->dataclasses_json==0.6.7) (25.0) 2025-09-07T09:22:09.9298385Z Requirement already satisfied: mypy-extensions>=0.3.0 in /home/ec2-user/.local/lib/python3.9/site-packages (from typing-inspect<1,>=0.4.0->dataclasses_json==0.6.7) (1.1.0) 2025-09-07T09:22:09.9300963Z Requirement already satisfied: typing-extensions>=3.7.4 in /home/ec2-user/.local/lib/python3.9/site-packages (from typing-inspect<1,>=0.4.0->dataclasses_json==0.6.7) (4.15.0) 2025-09-07T09:22:10.0612043Z Installing collected packages: python-dateutil, tzdata, numpy, pandas, boto3 2025-09-07T09:22:13.5453631Z Attempting uninstall: boto3 2025-09-07T09:22:13.5453922Z Found existing installation: boto3 1.35.33 2025-09-07T09:22:13.5510684Z Uninstalling boto3-1.35.33: 2025-09-07T09:22:13.5518850Z Successfully uninstalled boto3-1.35.33 
2025-09-07T09:22:13.5892610Z Successfully installed boto3-1.35.42 numpy-1.26.4 pandas-2.1.3 python-dateutil-2.8.2 tzdata-2025.2 2025-09-07T09:22:13.9093880Z Command completed after 1 attempt(s). 2025-09-07T09:22:13.9133508Z ##[group]Run python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-09-07T09:22:13.9133927Z python3 -m tools.stats.upload_utilization_stats.upload_utilization_stats \ 2025-09-07T09:22:13.9134234Z  --workflow-run-id "17525285611" \ 2025-09-07T09:22:13.9134471Z  --workflow-name "inductor-perf-nightly-x86" \ 2025-09-07T09:22:13.9134716Z  --workflow-run-attempt "1" \ 2025-09-07T09:22:13.9134915Z  --job-id "49775585769" \ 2025-09-07T09:22:13.9135279Z  --job-name "inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)" \ 2025-09-07T09:22:13.9135650Z  --local-path "" \ 2025-09-07T09:22:13.9135846Z  --artifact-prefix "" 2025-09-07T09:22:13.9140366Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:13.9140600Z env: 2025-09-07T09:22:13.9140751Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:13.9141031Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:13.9141312Z DEVICE_NAME: 2025-09-07T09:22:13.9141465Z DEVICE_TYPE: 2025-09-07T09:22:13.9141727Z ##[endgroup] 2025-09-07T09:22:14.7199871Z repo: pytorch/pytorch 2025-09-07T09:22:14.7200171Z Search for test log in s3 bucket: ossci-utilization 2025-09-07T09:22:14.7200545Z Downloading logs-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:14.7201029Z extracting usage_log.txt from zip file logs-test-inductor_huggingface_perf_cpu_x86-1-3-linux.24xl.spr-metal_49775585769.zip 2025-09-07T09:22:14.7201394Z Converted Log Model: UtilizationMetadata: 2025-09-07T09:22:14.7202237Z UtilizationMetadata(level='metadata', workflow_id='17525285611', job_id='49775585769', workflow_name='inductor-perf-nightly-x86', job_name='inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)', usage_collect_interval=4.0, data_model_version=1.5, start_at=1757231191, gpu_count=0, cpu_count=96, gpu_type=None, error=None) 2025-09-07T09:22:14.7203311Z [Db Segments] detected pytest cmd: 13, generated segments: 13 2025-09-07T09:22:14.7203570Z [db model] Peek db timeseries 2025-09-07T09:22:14.7203803Z :{ 2025-09-07T09:22:14.7203934Z "created_at": 1757236934, 2025-09-07T09:22:14.7204122Z "type": "utilization", 2025-09-07T09:22:14.7204287Z "tags": [ 2025-09-07T09:22:14.7204431Z "record" 2025-09-07T09:22:14.7204578Z ], 2025-09-07T09:22:14.7204716Z "time_stamp": 1757231191, 2025-09-07T09:22:14.7204890Z "repo": "pytorch/pytorch", 2025-09-07T09:22:14.7205067Z "workflow_id": 17525285611, 2025-09-07T09:22:14.7205236Z "run_attempt": 1, 2025-09-07T09:22:14.7205386Z "job_id": 49775585769, 2025-09-07T09:22:14.7205585Z "workflow_name": "inductor-perf-nightly-x86", 2025-09-07T09:22:14.7205965Z "job_name": "inductor-test-nightly-freezing / test (inductor_huggingface_perf_cpu_x86, 1, 3, linux.24xl.spr-metal)", 2025-09-07T09:22:14.7206294Z "json_data": "{}" 2025-09-07T09:22:14.7206437Z } 2025-09-07T09:22:14.7206727Z Writing 1 documents to S3 ossci-utilization/util_metadata/v_1.5/pytorch/pytorch/17525285611/1/49775585769/metadata 2025-09-07T09:22:14.7207216Z Done! 
Finish writing document to S3 ossci-utilization/util_metadata/v_1.5/pytorch/pytorch/17525285611/1/49775585769/metadata 2025-09-07T09:22:14.7207724Z Writing 378 documents to S3 ossci-utilization/util_timeseries/v_1.5/pytorch/pytorch/17525285611/1/49775585769/time_series 2025-09-07T09:22:14.7208310Z Done! Finish writing document to S3 ossci-utilization/util_timeseries/v_1.5/pytorch/pytorch/17525285611/1/49775585769/time_series 2025-09-07T09:22:14.7971996Z ##[group]Run pytorch/test-infra/.github/actions/teardown-linux@main 2025-09-07T09:22:14.7972343Z with: 2025-09-07T09:22:14.7972531Z env: 2025-09-07T09:22:14.7972721Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:14.7973117Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:14.7973512Z DEVICE_NAME: 2025-09-07T09:22:14.7973712Z DEVICE_TYPE: 2025-09-07T09:22:14.7973897Z ##[endgroup] 2025-09-07T09:22:14.7986263Z ##[group]Run set -eou pipefail 2025-09-07T09:22:14.7986526Z set -eou pipefail 2025-09-07T09:22:14.7986705Z  2025-09-07T09:22:14.7986928Z echo "Holding runner for 2 hours until all ssh sessions have logged out" 2025-09-07T09:22:14.7987190Z for _ in $(seq 1440); do 2025-09-07T09:22:14.7987399Z  # Break if no ssh session exists anymore 2025-09-07T09:22:14.7987612Z  if [ "$(who)" = "" ]; then 2025-09-07T09:22:14.7987796Z  break 2025-09-07T09:22:14.7987982Z  fi 2025-09-07T09:22:14.7988128Z  echo "." 2025-09-07T09:22:14.7988276Z  sleep 5 2025-09-07T09:22:14.7988423Z done 2025-09-07T09:22:14.7992477Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:14.7992701Z env: 2025-09-07T09:22:14.7992840Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:14.7993112Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:14.7993397Z DEVICE_NAME: 2025-09-07T09:22:14.7993542Z DEVICE_TYPE: 2025-09-07T09:22:14.7993687Z ##[endgroup] 2025-09-07T09:22:14.8011861Z Holding runner for 2 hours until all ssh sessions have logged out 2025-09-07T09:22:14.8078303Z ##[group]Run # ignore expansion of "docker ps -q" since it could be empty 2025-09-07T09:22:14.8078635Z # ignore expansion of "docker ps -q" since it could be empty 2025-09-07T09:22:14.8078886Z # shellcheck disable=SC2046 2025-09-07T09:22:14.8079115Z docker stop $(docker ps -q) || true 2025-09-07T09:22:14.8079322Z # Prune all of the docker images 2025-09-07T09:22:14.8079521Z docker system prune -af 2025-09-07T09:22:14.8083648Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:14.8083919Z env: 2025-09-07T09:22:14.8084083Z GIT_DEFAULT_BRANCH: main 2025-09-07T09:22:14.8084425Z DOCKER_CONTAINER_ID: 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:14.8084761Z DEVICE_NAME: 2025-09-07T09:22:14.8084940Z DEVICE_TYPE: 2025-09-07T09:22:14.8085107Z ##[endgroup] 2025-09-07T09:22:25.7262917Z 7e583f71185a 2025-09-07T09:22:26.8032493Z Deleted Containers: 2025-09-07T09:22:26.8032823Z 7e583f71185a036da1f1d481c1166cec6ea26eaa094de4da7dcd0a081e913845 2025-09-07T09:22:26.8033025Z 2025-09-07T09:22:32.7978216Z Deleted Images: 2025-09-07T09:22:32.7978873Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image:pytorch-linux-jammy-py3-gcc11-inductor-benchmarks-ae53c6842aa4c2407d0ad976491ca941c2635c77 2025-09-07T09:22:32.7979596Z untagged: 308535385114.dkr.ecr.us-east-1.amazonaws.com/pytorch/ci-image@sha256:383efb45082f20b8c808cb0ba4df693a01359592233f641f1f486911ac320a9a 2025-09-07T09:22:32.7980101Z deleted: 
sha256:662d8c9dfc7db2f5d004293de4f2b7647941dee4c916479ef082d17fcdfd9c47 2025-09-07T09:22:32.7980473Z deleted: sha256:ea5ad443c754124b3a5a209c2663376b4c156947edef1b982a336148bbf9114d 2025-09-07T09:22:32.7980830Z deleted: sha256:284be7504f072e0c04da4e2190e8d0e1de73835ed67be81f3ddd7eafd5d06a3a 2025-09-07T09:22:32.7981878Z deleted: sha256:2f49ff4be65f7ca55de8d7028fb3df7d08232a9f043aa7ba27d9393724286281 2025-09-07T09:22:32.7982310Z deleted: sha256:f63b503fdd1cca198aecefb9eef7ffbeb5fbc723f2a8462f50316e56cd403cbc 2025-09-07T09:22:32.7982679Z deleted: sha256:f9d46e08457013f0e71d608ac3dd95b79c41120060a80baefa684048cc15574e 2025-09-07T09:22:32.7983040Z deleted: sha256:cab76e28615751b6d6a703103b1da790a67cb3a4ee2e8814de51de18ff8b595d 2025-09-07T09:22:32.7983796Z deleted: sha256:0b2d09aa482371591a32563a5db71472822abd096a347967a9bd2a177737109f 2025-09-07T09:22:32.7984163Z deleted: sha256:d306d346d5da05e9fd04284304b1637a0bf01ee97397c688d19d783d5e133de9 2025-09-07T09:22:32.7984504Z deleted: sha256:bb3381a916d410a6e304540bb0796099dc780cd11f5829e734b337e0e79acfe4 2025-09-07T09:22:32.7984861Z deleted: sha256:bcf487c27e826c092985285163fb896e3324460b1774f3eb2a66623cd31e7d87 2025-09-07T09:22:32.7985210Z deleted: sha256:7d13485a9bdc5c0e64ac5085b25f4dded75c60f74090369c1b6f3f546ee37e94 2025-09-07T09:22:32.7985560Z deleted: sha256:55351d98a4197542fa7c78089671f447a6ef88cc554b7fad4fc522e8d4d187b6 2025-09-07T09:22:32.7985909Z deleted: sha256:f884bc0c4f9a994f3b3f1d82205f3a7014b05c84ad0c1c2fa3254d15a44f31e1 2025-09-07T09:22:32.7986250Z deleted: sha256:cdd16785a15239e518604ea9ea31405d5225fa6411d1c6d74d6523bcebf759ab 2025-09-07T09:22:32.7986601Z deleted: sha256:2c5bc1dc49446d7df5784578ae7c99460a93b502aa0c3b9deffbb95ec5216860 2025-09-07T09:22:32.7986959Z deleted: sha256:bae1e956be98416ce7d1a6c2c6ef0917f467238e19291786f8e1fed36fa81956 2025-09-07T09:22:32.7987310Z deleted: sha256:2cb1f002ab1126b0606999a9557b3f7f5da1e453d5376d29d95d60a979a215c4 2025-09-07T09:22:32.7987658Z deleted: sha256:25055a5f67b9bce8fac50ee1508dcb0f862ed154de5ded734e55f60edaca385f 2025-09-07T09:22:32.7988011Z deleted: sha256:98024e2dd34a5899240e41ae14f59c657cdc005040773e6ad7cfe3d67cdac7a8 2025-09-07T09:22:32.7988361Z deleted: sha256:8d2e75659096b4af8a20c3e9a6cce899b6e720f638eacdfd7d41ec8a736efdde 2025-09-07T09:22:32.7988711Z deleted: sha256:7741a6bf043548509c51c32e44734f30dfe07f91ca56c64422b004c3c0444e68 2025-09-07T09:22:32.7989066Z deleted: sha256:e2e63edbd2512e413c388888eabade05a2a7876adf20e7f0e0c3660ac3acbd3d 2025-09-07T09:22:32.7989418Z deleted: sha256:7fdea0f7711ee22084f87dc6d651598b5e5c5237de828105f698cb6a937d4c9c 2025-09-07T09:22:32.7989768Z deleted: sha256:486a2cf42f9492f291d59d48f3cec5a0a72449d8b6ad7d7a02596da237cdd154 2025-09-07T09:22:32.7990123Z deleted: sha256:a17da64c93a4939fad81a3ff6b6cb30f988176a6e0062fcf9c65e06cd9b9c3fb 2025-09-07T09:22:32.7990479Z deleted: sha256:70b4a3a917b8f95b19ae5dab6f404af8fa1c886022e4a1d785654013d5d876af 2025-09-07T09:22:32.7990834Z deleted: sha256:bd1b9d6a8aa636a67023800dcd85e4a3a7a7a21d65c6e6491d169fa65b4404a9 2025-09-07T09:22:32.7991270Z deleted: sha256:e3befcf3d3693c1d7bf0535e6e6722f0aabb0123805443ef5915dd5441ed0b00 2025-09-07T09:22:32.7991629Z deleted: sha256:4b4f846f1c4266b015f5fdf8dac5346c083c3aee2375e337172c112677c5a8c0 2025-09-07T09:22:32.7991983Z deleted: sha256:f05dc4d1350267b90e07af241a64f86a928fb3d8de75717ac04ec5a0433d042f 2025-09-07T09:22:32.7992338Z deleted: sha256:b6b4de696915fa2db09844ec9ac44dbb2940b655cd356404cf1ff03eec644dad 2025-09-07T09:22:32.7992733Z deleted: 
sha256:da008bbe1fc29cb35b3949040e97eb801f3264a56c4dd1b9d43a3cb54f2a39b2 2025-09-07T09:22:32.7993092Z deleted: sha256:261da5d14cad99ee11dcdaeb6055726f38fc12b7c559ee9c6d2ddc3f288f4828 2025-09-07T09:22:32.7993446Z deleted: sha256:16f900c60e70d685a85ca571ee0dada993a02217bdd6bb8b1d49169e7e28cf41 2025-09-07T09:22:32.7993807Z deleted: sha256:f57b18c5cde1d1dc553a15e1e98141d4afc0b4d0bb1182cc85b2c21bd18bb783 2025-09-07T09:22:32.7994154Z deleted: sha256:3c79105088ac60b231e4553752ee42cb6a87f9d32736b32f0c2123dddec724e7 2025-09-07T09:22:32.7994501Z deleted: sha256:df1ffff478908236efb6ceb8e05e6e078f12b864f4d24ce598cba7b961fad65c 2025-09-07T09:22:32.7994853Z deleted: sha256:8170255b562b59b76768f18a5b84b1ba887db93d3fe43b87a74bdc6be4f82014 2025-09-07T09:22:32.7995200Z deleted: sha256:c863cfe6bed704be5a54617331e27158b6f5a492dd6b9ed9c99d23db017cf5e1 2025-09-07T09:22:32.7995558Z deleted: sha256:e9e5a98c073f72c3abf9cc98724a31a3791535574ac78aeda7eb5df4580b21d0 2025-09-07T09:22:32.7995910Z deleted: sha256:0a42ac98735ca6578911218be7a7918001fe8aee1eb33d98f0d0a153d0e1102d 2025-09-07T09:22:32.7996252Z deleted: sha256:77d5a8aaa4d0fe1210dda9ac1f0fa3cf6141fea925b6240b9839d7505d021d3f 2025-09-07T09:22:32.7996610Z deleted: sha256:fa6ec46c43532dc01449df1cc403de8bb5872f859076e90658534c51c1487ef9 2025-09-07T09:22:32.7997032Z deleted: sha256:424a12dd5083283e19af48d31b7f2e33911ca8f459796f17280eaf5777a9aa25 2025-09-07T09:22:32.7997387Z deleted: sha256:8f0499601e14f1073e20ce889b45d12ab33264f9cf30359ac29dddbf58a311aa 2025-09-07T09:22:32.7997783Z deleted: sha256:5a5fae32dfb81abcd7bf374018b11e8e42a5aa39841d4b94e822d306c9af015b 2025-09-07T09:22:32.7998151Z deleted: sha256:d1bda89f22d383d38dfb7f7590b3bb202ccb91814034e7c7e2493306a10151ef 2025-09-07T09:22:32.7998517Z deleted: sha256:dbf16c1fcae146528685a8f745f9c505b24ba9ef009c42b1bd711ff7bf51b936 2025-09-07T09:22:32.7998873Z deleted: sha256:f9ec0065788f638325536a37427e2635b760a32457f20ca0acbcef6946b1041b 2025-09-07T09:22:32.7999229Z deleted: sha256:9d9911dac8fb2ff7db87329f38625d73f452dfef8822830048bbc00541c7df14 2025-09-07T09:22:32.7999578Z deleted: sha256:de4c1937129850e357b0de484d230569f628ac0bc883b12eff42932cd1e193ce 2025-09-07T09:22:32.7999942Z deleted: sha256:7b3c9e5b56a1d74226a5c1a54e5cb5e749012aa9b1d2376c6e7503757e29c35b 2025-09-07T09:22:32.8000301Z deleted: sha256:8062a6f28fc5fe2a199e1c1c40b6c43b7e29eb0c452492b47ec6900413b19cb6 2025-09-07T09:22:32.8000666Z deleted: sha256:f879aeffe6886f8da80462b571f9307aa63bb961645bec55ff579187a81cfd0b 2025-09-07T09:22:32.8001028Z deleted: sha256:5c6ef06b3536a430194aee509a784ee889c4a9d6248cb20fd9290e87e4ee2245 2025-09-07T09:22:32.8001387Z deleted: sha256:461aea034a25a2d72be6adfe9213c457c4cbf48724e9cb1c57987afb87668f21 2025-09-07T09:22:32.8001759Z deleted: sha256:e342cd1c71b7d0b024ea16b4a11f3f7fbbc2e3d11ef754c9d242aa50c4f8b0a3 2025-09-07T09:22:32.8002137Z deleted: sha256:bffd35a7fa1ddcfe05f79b7d3cae4180928eeea00eaab7ed7f484bc31adfc1d5 2025-09-07T09:22:32.8002509Z deleted: sha256:b34e33e7b04b5cbb5d5852199430593bfa18ddfe9081df42284230a14ebb739e 2025-09-07T09:22:32.8002867Z deleted: sha256:21d9b55338774d9ddc66d0bfcc92af9c8d2ecd94d1710b7049f5a811e411af7b 2025-09-07T09:22:32.8003225Z deleted: sha256:6cc2b33909585d17bf269fb8297ff881249e136137254734f7d23b9583208718 2025-09-07T09:22:32.8003585Z deleted: sha256:ca7f55b7c6d6cb11ddd8e187da34c2695fc2ce7655d652b9c9dc140a01ed056f 2025-09-07T09:22:32.8003957Z deleted: sha256:a3ece3d0ab6e99ef783c4f8d27d0e38504ab4477590ef556c16d22d92ba63a43 2025-09-07T09:22:32.8004319Z deleted: 
sha256:c137b0d41177c753aa1b69b11d0dd1f82420bf8520371866c845b53dca10b2d0 2025-09-07T09:22:32.8004686Z deleted: sha256:1e0d92b07bce12e511af59f608edd1932b10704d700f5e7538e406b90ecbb615 2025-09-07T09:22:32.8005098Z deleted: sha256:2ec3d01b3031e9da124d67410f54866ec5c679a0d6e4aee6b31608c45ce7fd77 2025-09-07T09:22:32.8005464Z deleted: sha256:308cffbd71363688c672b2043c6b9bf647cfb84593c42c3d88e3f36ee8f7f1b4 2025-09-07T09:22:32.8005904Z deleted: sha256:d965d9873fa450daba50a85d961f0835b14374167d84cfafa6060d16229f4229 2025-09-07T09:22:32.8006327Z deleted: sha256:effd997e222f62a34133bb2ecf9c0ffee151e5797f72e734d86a270d2e722374 2025-09-07T09:22:32.8006685Z deleted: sha256:0bbc1c78c10ee09c2697cfcce347dc9edbf82a7ccc25a6db6ee0a8dda398f7f2 2025-09-07T09:22:32.8007047Z deleted: sha256:214858e773d1ad73c2965c19b29cbfd3e2a974daa879163e1c1eb96567a7ee06 2025-09-07T09:22:32.8007410Z deleted: sha256:a9c7a2cd7ae229b26e84c093de657d0f4334d6cc9301991c6c3245ff62a9a71d 2025-09-07T09:22:32.8007764Z deleted: sha256:749a80551ef3f272e2517cb065bc7a5250da47d0b36bf74ed453caa9a5fee265 2025-09-07T09:22:32.8008102Z deleted: sha256:39b014c4e62d21c11df6c6d775d3f345675014292198981f455bacc4515a0f7b 2025-09-07T09:22:32.8008446Z deleted: sha256:0f087c9a894566644f825f5f87308d92e4cf149c51f7cd4769cbfaeefd3df791 2025-09-07T09:22:32.8008799Z deleted: sha256:dc6eb6dad5f9e332f00af553440e857b1467db1be43dd910cdb6830ba0898d50 2025-09-07T09:22:32.8009009Z 2025-09-07T09:22:32.8009099Z Total reclaimed space: 83.93GB 2025-09-07T09:22:32.8079846Z Post job cleanup. 2025-09-07T09:22:32.8105663Z Post job cleanup. 2025-09-07T09:22:32.8885383Z [command]/usr/bin/git version 2025-09-07T09:22:32.8940824Z git version 2.47.1 2025-09-07T09:22:32.8964528Z Copying '/home/ec2-user/.gitconfig' to '/home/ec2-user/actions-runner/_work/_temp/39889fcc-4ed6-48a4-b574-e74d4e63d414/.gitconfig' 2025-09-07T09:22:32.8971810Z Temporarily overriding HOME='/home/ec2-user/actions-runner/_work/_temp/39889fcc-4ed6-48a4-b574-e74d4e63d414' before making global git config changes 2025-09-07T09:22:32.8972310Z Adding repository directory to the temporary git global config as a safe directory 2025-09-07T09:22:32.8975278Z [command]/usr/bin/git config --global --add safe.directory /home/ec2-user/actions-runner/_work/pytorch/pytorch 2025-09-07T09:22:32.9004989Z [command]/usr/bin/git config --local --name-only --get-regexp core\.sshCommand 2025-09-07T09:22:32.9033131Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'core\.sshCommand' && git config --local --unset-all 'core.sshCommand' || :" 2025-09-07T09:22:32.9299845Z Entering 'android/libs/fbjni' 2025-09-07T09:22:32.9348810Z Entering 'third_party/FP16' 2025-09-07T09:22:32.9393982Z Entering 'third_party/FXdiv' 2025-09-07T09:22:32.9436000Z Entering 'third_party/NNPACK' 2025-09-07T09:22:32.9481250Z Entering 'third_party/NVTX' 2025-09-07T09:22:32.9525767Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T09:22:32.9570550Z Entering 'third_party/XNNPACK' 2025-09-07T09:22:32.9625741Z Entering 'third_party/aiter' 2025-09-07T09:22:32.9668926Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T09:22:32.9720515Z Entering 'third_party/benchmark' 2025-09-07T09:22:32.9763568Z Entering 'third_party/composable_kernel' 2025-09-07T09:22:32.9817513Z Entering 'third_party/cpp-httplib' 2025-09-07T09:22:32.9861473Z Entering 'third_party/cpuinfo' 2025-09-07T09:22:32.9905219Z Entering 'third_party/cudnn_frontend' 2025-09-07T09:22:32.9949625Z Entering 'third_party/cutlass' 
2025-09-07T09:22:33.0003748Z Entering 'third_party/fbgemm' 2025-09-07T09:22:33.0046035Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T09:22:33.0088041Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T09:22:33.0136076Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T09:22:33.0182735Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T09:22:33.0227469Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T09:22:33.0264021Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T09:22:33.0306679Z Entering 'third_party/fbgemm/external/json' 2025-09-07T09:22:33.0350980Z Entering 'third_party/flash-attention' 2025-09-07T09:22:33.0393931Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T09:22:33.0437207Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T09:22:33.0481674Z Entering 'third_party/flatbuffers' 2025-09-07T09:22:33.0530643Z Entering 'third_party/fmt' 2025-09-07T09:22:33.0574897Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T09:22:33.0619243Z Entering 'third_party/gloo' 2025-09-07T09:22:33.0664409Z Entering 'third_party/googletest' 2025-09-07T09:22:33.0708094Z Entering 'third_party/ideep' 2025-09-07T09:22:33.0747446Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T09:22:33.0798108Z Entering 'third_party/ittapi' 2025-09-07T09:22:33.0840294Z Entering 'third_party/kineto' 2025-09-07T09:22:33.0886114Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T09:22:33.0925988Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 2025-09-07T09:22:33.0967611Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T09:22:33.1010888Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T09:22:33.1058319Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T09:22:33.1104142Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T09:22:33.1150809Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T09:22:33.1192818Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T09:22:33.1237115Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T09:22:33.1281768Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T09:22:33.1327687Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T09:22:33.1370771Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T09:22:33.1415907Z Entering 'third_party/kleidiai' 2025-09-07T09:22:33.1457417Z Entering 'third_party/mimalloc' 2025-09-07T09:22:33.1502922Z Entering 'third_party/nlohmann' 2025-09-07T09:22:33.1550641Z Entering 'third_party/onnx' 2025-09-07T09:22:33.1607596Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T09:22:33.1656715Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T09:22:33.1703554Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T09:22:33.1742622Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T09:22:33.1784620Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T09:22:33.1831001Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T09:22:33.1870450Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T09:22:33.1913091Z Entering 
'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T09:22:33.1957221Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T09:22:33.2002017Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T09:22:33.2045876Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T09:22:33.2091842Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T09:22:33.2152193Z Entering 'third_party/pocketfft' 2025-09-07T09:22:33.2195935Z Entering 'third_party/protobuf' 2025-09-07T09:22:33.2243574Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T09:22:33.2288097Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T09:22:33.2333877Z Entering 'third_party/psimd' 2025-09-07T09:22:33.2377767Z Entering 'third_party/pthreadpool' 2025-09-07T09:22:33.2421672Z Entering 'third_party/pybind11' 2025-09-07T09:22:33.2465568Z Entering 'third_party/python-peachpy' 2025-09-07T09:22:33.2508911Z Entering 'third_party/sleef' 2025-09-07T09:22:33.2551397Z Entering 'third_party/tensorpipe' 2025-09-07T09:22:33.2595905Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T09:22:33.2640689Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T09:22:33.2685992Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T09:22:33.2726111Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T09:22:33.2767294Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T09:22:33.2826477Z [command]/usr/bin/git config --local --name-only --get-regexp http\.https\:\/\/github\.com\/\.extraheader 2025-09-07T09:22:33.2843847Z http.https://github.com/.extraheader 2025-09-07T09:22:33.2850702Z [command]/usr/bin/git config --local --unset-all http.https://github.com/.extraheader 2025-09-07T09:22:33.2874698Z [command]/usr/bin/git submodule foreach --recursive sh -c "git config --local --name-only --get-regexp 'http\.https\:\/\/github\.com\/\.extraheader' && git config --local --unset-all 'http.https://github.com/.extraheader' || :" 2025-09-07T09:22:33.3114049Z Entering 'android/libs/fbjni' 2025-09-07T09:22:33.3143391Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3169056Z Entering 'third_party/FP16' 2025-09-07T09:22:33.3199053Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3227686Z Entering 'third_party/FXdiv' 2025-09-07T09:22:33.3257320Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3282174Z Entering 'third_party/NNPACK' 2025-09-07T09:22:33.3312279Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3338904Z Entering 'third_party/NVTX' 2025-09-07T09:22:33.3370909Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3401518Z Entering 'third_party/VulkanMemoryAllocator' 2025-09-07T09:22:33.3434131Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3462173Z Entering 'third_party/XNNPACK' 2025-09-07T09:22:33.3492769Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3528812Z Entering 'third_party/aiter' 2025-09-07T09:22:33.3557199Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3586112Z Entering 'third_party/aiter/3rdparty/composable_kernel' 2025-09-07T09:22:33.3616671Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3649834Z Entering 'third_party/benchmark' 2025-09-07T09:22:33.3681469Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3708899Z Entering 'third_party/composable_kernel' 2025-09-07T09:22:33.3736183Z 
http.https://github.com/.extraheader 2025-09-07T09:22:33.3765946Z Entering 'third_party/cpp-httplib' 2025-09-07T09:22:33.3795773Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3822039Z Entering 'third_party/cpuinfo' 2025-09-07T09:22:33.3853709Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3884339Z Entering 'third_party/cudnn_frontend' 2025-09-07T09:22:33.3915990Z http.https://github.com/.extraheader 2025-09-07T09:22:33.3944135Z Entering 'third_party/cutlass' 2025-09-07T09:22:33.3973960Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4006798Z Entering 'third_party/fbgemm' 2025-09-07T09:22:33.4036478Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4064864Z Entering 'third_party/fbgemm/external/asmjit' 2025-09-07T09:22:33.4096298Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4124036Z Entering 'third_party/fbgemm/external/composable_kernel' 2025-09-07T09:22:33.4154713Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4188382Z Entering 'third_party/fbgemm/external/cpuinfo' 2025-09-07T09:22:33.4217506Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4244461Z Entering 'third_party/fbgemm/external/cutlass' 2025-09-07T09:22:33.4275990Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4306974Z Entering 'third_party/fbgemm/external/googletest' 2025-09-07T09:22:33.4335580Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4360680Z Entering 'third_party/fbgemm/external/hipify_torch' 2025-09-07T09:22:33.4394296Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4420697Z Entering 'third_party/fbgemm/external/json' 2025-09-07T09:22:33.4451727Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4481543Z Entering 'third_party/flash-attention' 2025-09-07T09:22:33.4510675Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4538566Z Entering 'third_party/flash-attention/csrc/composable_kernel' 2025-09-07T09:22:33.4564994Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4596847Z Entering 'third_party/flash-attention/csrc/cutlass' 2025-09-07T09:22:33.4625035Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4657331Z Entering 'third_party/flatbuffers' 2025-09-07T09:22:33.4688837Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4715248Z Entering 'third_party/fmt' 2025-09-07T09:22:33.4745627Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4771837Z Entering 'third_party/gemmlowp/gemmlowp' 2025-09-07T09:22:33.4803023Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4830618Z Entering 'third_party/gloo' 2025-09-07T09:22:33.4859383Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4889113Z Entering 'third_party/googletest' 2025-09-07T09:22:33.4920135Z http.https://github.com/.extraheader 2025-09-07T09:22:33.4946408Z Entering 'third_party/ideep' 2025-09-07T09:22:33.4976523Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5001732Z Entering 'third_party/ideep/mkl-dnn' 2025-09-07T09:22:33.5029672Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5062405Z Entering 'third_party/ittapi' 2025-09-07T09:22:33.5092877Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5117649Z Entering 'third_party/kineto' 2025-09-07T09:22:33.5150939Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5179676Z Entering 'third_party/kineto/libkineto/third_party/dynolog' 2025-09-07T09:22:33.5210309Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5235631Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/DCGM' 
2025-09-07T09:22:33.5262948Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5290192Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/cpr' 2025-09-07T09:22:33.5316882Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5340770Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/fmt' 2025-09-07T09:22:33.5367296Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5392835Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags' 2025-09-07T09:22:33.5419725Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5446836Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/gflags/doc' 2025-09-07T09:22:33.5474760Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5503177Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/glog' 2025-09-07T09:22:33.5531493Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5558945Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/googletest' 2025-09-07T09:22:33.5589363Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5614169Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/json' 2025-09-07T09:22:33.5642447Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5668480Z Entering 'third_party/kineto/libkineto/third_party/dynolog/third_party/pfs' 2025-09-07T09:22:33.5698695Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5725189Z Entering 'third_party/kineto/libkineto/third_party/fmt' 2025-09-07T09:22:33.5752177Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5780397Z Entering 'third_party/kineto/libkineto/third_party/googletest' 2025-09-07T09:22:33.5807323Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5837652Z Entering 'third_party/kleidiai' 2025-09-07T09:22:33.5867227Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5894875Z Entering 'third_party/mimalloc' 2025-09-07T09:22:33.5923198Z http.https://github.com/.extraheader 2025-09-07T09:22:33.5947843Z Entering 'third_party/nlohmann' 2025-09-07T09:22:33.5976236Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6004792Z Entering 'third_party/onnx' 2025-09-07T09:22:33.6037501Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6079615Z Entering 'third_party/onnx/third_party/pybind11' 2025-09-07T09:22:33.6109033Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6138239Z Entering 'third_party/opentelemetry-cpp' 2025-09-07T09:22:33.6167175Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6193897Z Entering 'third_party/opentelemetry-cpp/third_party/benchmark' 2025-09-07T09:22:33.6221536Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6245027Z Entering 'third_party/opentelemetry-cpp/third_party/googletest' 2025-09-07T09:22:33.6274671Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6302108Z Entering 'third_party/opentelemetry-cpp/third_party/ms-gsl' 2025-09-07T09:22:33.6327651Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6352837Z Entering 'third_party/opentelemetry-cpp/third_party/nlohmann-json' 2025-09-07T09:22:33.6382179Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6408384Z Entering 'third_party/opentelemetry-cpp/third_party/opentelemetry-proto' 2025-09-07T09:22:33.6439828Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6466335Z Entering 'third_party/opentelemetry-cpp/third_party/opentracing-cpp' 2025-09-07T09:22:33.6497766Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6520859Z Entering 
'third_party/opentelemetry-cpp/third_party/prometheus-cpp' 2025-09-07T09:22:33.6548327Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6574745Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/civetweb' 2025-09-07T09:22:33.6606560Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6634088Z Entering 'third_party/opentelemetry-cpp/third_party/prometheus-cpp/3rdparty/googletest' 2025-09-07T09:22:33.6665953Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6694887Z Entering 'third_party/opentelemetry-cpp/tools/vcpkg' 2025-09-07T09:22:33.6725355Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6767070Z Entering 'third_party/pocketfft' 2025-09-07T09:22:33.6797125Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6820003Z Entering 'third_party/protobuf' 2025-09-07T09:22:33.6848491Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6876630Z Entering 'third_party/protobuf/third_party/benchmark' 2025-09-07T09:22:33.6905962Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6933961Z Entering 'third_party/protobuf/third_party/googletest' 2025-09-07T09:22:33.6963689Z http.https://github.com/.extraheader 2025-09-07T09:22:33.6995019Z Entering 'third_party/psimd' 2025-09-07T09:22:33.7024324Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7047595Z Entering 'third_party/pthreadpool' 2025-09-07T09:22:33.7077465Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7104805Z Entering 'third_party/pybind11' 2025-09-07T09:22:33.7135958Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7165562Z Entering 'third_party/python-peachpy' 2025-09-07T09:22:33.7195254Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7221376Z Entering 'third_party/sleef' 2025-09-07T09:22:33.7252865Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7280132Z Entering 'third_party/tensorpipe' 2025-09-07T09:22:33.7310266Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7339665Z Entering 'third_party/tensorpipe/third_party/googletest' 2025-09-07T09:22:33.7370696Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7399175Z Entering 'third_party/tensorpipe/third_party/libnop' 2025-09-07T09:22:33.7430480Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7456289Z Entering 'third_party/tensorpipe/third_party/libuv' 2025-09-07T09:22:33.7486577Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7511363Z Entering 'third_party/tensorpipe/third_party/pybind11' 2025-09-07T09:22:33.7539555Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7564662Z Entering 'third_party/tensorpipe/third_party/pybind11/tools/clang' 2025-09-07T09:22:33.7597150Z http.https://github.com/.extraheader 2025-09-07T09:22:33.7694827Z A job completed hook has been configured by the self-hosted runner administrator 2025-09-07T09:22:33.7706163Z ##[group]Run '/home/ec2-user/runner-scripts/after_job.sh' 2025-09-07T09:22:33.7709374Z shell: /usr/bin/bash --noprofile --norc -e -o pipefail {0} 2025-09-07T09:22:33.7709625Z ##[endgroup] 2025-09-07T09:22:33.7822421Z [!ALERT!] Swap in detected! [!ALERT!] 2025-09-07T09:22:41.0590902Z [!ALERT!] Swap out detected [!ALERT!] 2025-09-07T09:22:53.4680773Z Cleaning up orphan processes
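
The teardown-linux step logged earlier holds the runner for up to 2 hours while any ssh session opened via setup-ssh is still active, then stops the job container and prunes all Docker images (reclaiming ~84 GB on this host). A rough Python equivalent of that shell sequence, for illustration only; the real action is a bash step, and the 1440 x 5 s bound comes straight from the logged script:

    import subprocess
    import time

    def hold_runner_until_idle(max_checks=1440, interval_s=5):
        # Counterpart of the logged loop: break as soon as `who` reports no sessions.
        for _ in range(max_checks):
            if subprocess.run(["who"], capture_output=True, text=True).stdout.strip() == "":
                break
            print(".")
            time.sleep(interval_s)

    def docker_teardown():
        # `docker stop $(docker ps -q) || true` followed by `docker system prune -af`.
        running = subprocess.run(["docker", "ps", "-q"], capture_output=True, text=True).stdout.split()
        if running:
            subprocess.run(["docker", "stop", *running], check=False)
        subprocess.run(["docker", "system", "prune", "-af"], check=True)

    if __name__ == "__main__":
        hold_runner_until_idle()
        docker_teardown()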